[Binary artifact — not a text document. This is a tar archive of a Zuul CI output directory (`var/home/core/zuul-output/`) containing `logs/kubelet.log.gz`, a gzip-compressed kubelet log. The compressed binary contents are not recoverable as text and have been removed.]
[w!}+uw뤻C{wpSZ_~۽ .y{[[|k]CO{B#yC}0jY,h{z[6- їۣ,$>HW8 Q =:KJ,uwXPF.WWX ]|`~DNJ0D#b l̼J*@Dԋ#+O lr 9rLH5u~2Ib ҉j8y UNQG"7q/WnK+fL83ma,ˏФ% |)=(?D{鸃" ox*)Av]5jؖ[Ė q$:ٮRs}oHDdϡ%״Y Mi,[ ФPjbIc6ɪqlYӨ&:{2btcD}Q,Į;N;[폇ãf8%6қ$A6 -7NnֳKE BG@ {T'k"vwg︽9<#ԋL_k/ ~Nm6'{p@E?=<)><Z;:qd uq磫Eoo'l#vA8C Aɴz$'->nbуP{kĴ:.D $@yr"#K ̊p&{(3VْU9*j>XqtdUVLJZ,B"88Mο]b O1^}풡J2kevZ뽙)|iN^g@|q=& N1PI!"IRLL8KdI`מJTI\&$9*U%-۷K=&oHDxz_zvO[ўsxsctx]ۃ{,L 3vQ5! $^[DưO◴lKgDݬDkoo6@N! +r,.[C] iẻ̤=/?7P|lF.닼s Ehg(.h6qxI.!9>Lo0 "TXT|tO{ ɖDtN|CkPRbY !L{ܿ[ų~B?Q00uk\P W4p 'ff\uP LJ3!T8T&©\ٲ>Bo~荟^]WW[N_4 d?;(v -o|nF].}۫ jbWV,W5/u|K"Ƈ证Q0V^/4kC/6$%* EE@JPn'~w7C ϖ*!T9}# MZmBʄ+w.RmJqXL$0UtS0UZҖoRaKootr"E"UN`'y}Z'wOۓ-˵tb92O]ە-j(4JA4bغ"QP;~nf"gddAL :mv1Í_EBG>X*˫x\[{fF")Pz~xݯhtQMFXU!kor`|3`r7~]Unm*v -0G)hSC ]HQ.a`TO//Ts@d]]_٪\UKsλ?Z0ؼ7PF\xI Ci ZΜK/0qQ IxͣYpXQKNY,4FtNyE*`&Ff|p,eG6{拶8dIG݋WKp0~'WTG.w3=&m6~hc+ [XG]*ڲImP[ &2F,j.;bS"计|W"%Dq4=BZr0/~S[uj+J2CVwu~2* *D]kPB_TyG8f=_5-֝U~m_S}} Q?\l6gV3Kcө5ohgt-A0ٗo'60Oؽm6Аܜ_jb5h^J@63Ѓ:atX)q(ÿ;>^=]׳^z3ϕ&]ý r)٫J@<ݜ&?ONIʆ)j 6}UTL6-XIC}SWSKu4= ehLTpt}˷eC5cZٚi̫ >"@qʵGF۲҄j\k(jd? 9;b <]]K>,UT0a(TL[lÑ ŗ|ۥ:0%eQK5G4SQa4=1MrKJPzKʶ7_Q*-=dKwwg&gyĺ%mR6yjEW//p#@f#n)`?7نgnbBb"eK<~TF 5-ŔX bX_G4/Y3m&@Zm?T7'*,5|n 23AehM#8&.Nq˦eXֻQʕgc2re;CU-fayǁ?ߢ_ÈLu!W?w-Ydف< sc\pf)Td=JuΪtXZn>uwX%nUVNisoAڣm(j mtfuWKtTlnMduWo\,Z/MA],ㅲ\`n6|i͝?l0vwXXklyY2Վ Rqy0?Gho5]{ܭJĽ r2A&\=lU 4nK,JrBes0"4E iv4= ݯ/Mc=[֖tExRCyb7XD)ig$Q } p*L44h 2n|Bt}) "„0mI*P Rf1 %#n a$!w. ډQ|/ &6Zs*`6.`)n{8aѤEGCRN;Br@(EpqxjwXv}-XJ9<v`G}8g&t*whc>dG_/޽EXe7$$3 ~Mq6u2/xvjRN8xӒ |sEݬvfg1GWCҪD[X0hŽB+jZjA2AZ,(b,':*AP!EAO[S=;Y|Y׻x}5$Ϝ3QIah,?rCҁk0Ȑ@Gq  "f{aHUo݃ܦ]źY6\($wA쥏D},rjTCfANl TGMVb4.]125y}TMr&@u43򵐺?UGAEHܯMO%He"K5=>z 3g˲d9J9yS0qm o.VK_~q}EUnEmZ G 68쬗 g̬Y1O1uzůJ:H"DzMzҒ7ll+o?ȞV>l\. 
0I[rgKץMɌ5g[ǑLdo\ 26.ˤf^_3f9&\l7m`mZPpC0pr56Ȇb6=I-&rRuǡAr<ɲ/T[ @'}}-OXX]_x CQ?q҃oh~,Z.8Fk+Q/K8f9w3qno_ձzXcd6^q*j"%X0%$Ԓ ciۯ &We*j/c@$bN xNyҍ{-ZbOZC^FE?<_0^{YJ#ޙh㍳h48f="!VD+f0ۺvOO>-lPAtl2kZִiԴְ=(ņX:~2QYI<;[sm\bV.(Nڐb`{{ &zRj)>|d Yc@PPQ`ĚyՑ_HZUHjrEzxSKh7BR㤂CQ ƋT)*RC鈳ɝb 6w;ָ[/qE>f5V2Zf)͈&0;ThĢz䁠,n}d y:s8"=J@(2Q/RûI'i=t/RD..t'+rt}/>ԋ}Sɽ+"z؟Y73RI9cfM?һQÐDq8 E,>nmz_͸ˀj~;zw.ENߕ~Rt}ERoFF!#KO)&W.j\Όg58Bㅴ@2qK+몫4]֋jzgmKEe\(c^RoU;l[MjxP|^uJ]Tmgul⼈X|ߕFl~fY4㻳WΙLEUp_~GX k py\v"e]3D-_Y8xS`}E Ә,Gzd-i-kz5h|s& lC'p{꧟O6pWm1 a,$tZeyt JL+wȁĂ3@-T0@,c#De^ӥ)N9LIo N3GVu>t-$UqF1㎘qĽפ $9Tb1͵w@@HLmȄ Ym==2{sU'\̙Y1gOO8pUa,nV .,FnuG5%<#!ڒA5[X0jF+gQs;}C>=ϵ\/xSo![[PϦ ;oƣe7['NI4o|Vwm+fc⁙wq̱"ԵSv=sJD"@Az⁉Xh*@p&$K`ݪT,{}%|Y0YZZNN%N㋤sJC(H=Ts30TDŴw˝QVKׇw[#s~:@F7àΟ9*^)t\^,_Lj#;P0v5#yJPZJ@)dxQɼ pЩe90$gSFOiM>ʵ:,LL+#Z9$i)ǛpPl#"þnGrS$PBT+ YLiiW{3V+v]Nr{}̧K!ȕvhw9ڳET#y(Ai7(\,#\,!ˮ\,!E^4QC|BӀ8TiTJ{QE{4JArW^b<_aoQiZ&@} ሽcA0MT%QC6n*ӱcg\rr79EC-Q;3)lMx!bFf1 %#?,$ԻreAYcw+m삉- :ND?#}3EvÇLA1|hRD2Ǐ߇>jDq.2wH+ QaT"D/Sq8|49DdT 9C5}shc>䨎^:vOQ#?}T)W::/Re{ 8PQg'jJc z 2~Ͽ>|o\<( *UiϓF\TI/o}H K=(E+ DX[7xs hf*UW{yp&z6Q*T}(yv .3MAYլ$*yw!)cI7ǹrZ~._ y]08jvNj%ݧ6]oge;S1qWqן ;L[jhqq*^ !NIr  %$WםNvm #i~dٺ#68 F6Qg`3Z|mWy;};65kwM-IKr U u+ Awe;VW|1Ph/c4 Yrjc}.C=f(SCwUEP n\.RLrHTjxTjW򂯘 ]L۴fz7#CoDwEݖ}՞X0cjyL40/M4f^+/qc:4V{4ߎVUvڥJ+unD줰sW=r=4oZhWmTAs-r1!o6!Kz0w+#cRsUI!|"rd"^I1h͘v׵ _O4F4]~ge3u64+kV.)N./sOۈ/Ghf#0 jӺLk<*÷Lp2T#R݋'9R1t%[|9Zqȗ0yې/QȗZu|!'|!%SYCܡ0[vP%ĿOڈ D¦@MHdTb*aV e_q;/>РKq±[oe%UgbEN22em;V^q{~o=&o綧b1Jibb D :gIO=a)5 '5/雏 _y:+CHX(fwAB%@ydZs"N)}x6V<NG]8ao3hZ,ʃ&#UMսdw\an_"wU !9sТBbTƠSV dmt`m> aw\^V7s۴NM|Jyg F_sV"*4KZ 3(Y.0N *1$MF&J!6t".T;-F )= "F@ IDGe[ uS0V00f4X*5 ΈIR)52JX\19?Xq>ZVVvIuaSQ%jU8]Ⱦ,䛲<ኟQMk}iaf0"As/62!4CI$ I&pd(U/ F eSRD5oR-]9-Z鰝zPh"Y`VG&xOVAtc8c٨j̅6r~k#;5Jw`vh%8ܠ  !q &+]\2q[˯^\VJ\-+&0_NӴ?q57eAtA%|f8U;zTu?Y <.AmK՗.H y\>ˇuLW0!_]8ECw挠 UJ_ a@W*]76Y BzGW}3@ wa?|)l[" q rԩ_4fY~,)S3˜$cqk$26$m r^2pZf%3{]LQQIP@?[ZJC<Ih?&Q\>N>m_|D}Պ#_F߻JC R9|g4#F㩌%eb/2)Ȭ6suH d3i:j%0֨kW3[T9bC*7Έ\љ3Z)]\V*Q +tI\A)WvE\ q*Rqu 4F]ɧzv<^V F>.g~m;4VD9 
8xlDZ3ᕡ[jzur./X\mrv ÿ|3C݁0pwd:j<;'֫uA\ݿ1 (^˃1XүλJ5-mwȖv6d\t]h x N*Em}y\i6K\ )YP/g@O0WȽg`& ?] zEv`Pl89Pztb{f]M5} 2(o]c6̖~v\.UZ٬s#`'}˾n(VaQ;YԤ8\-p6z|%A&Y˕pGmqϿzu5f|l6!WBaܺ y}ሜ-ؒ'l'R!N@<J Q%jl%/̻ؒ[Yɽs6߶d7) lUNHTc411"o3{R 69k5IlʹuwZD+yce@HX(fwAB%(LkN)u+ywncE+ K*s`-uN9ģ:$F(8zA8l :%nU Hf &|+~/PA W؋|eus8O݃?+`0:OET2i803U(fQF\`RyɝAQTbIȍL>&nClE]!, wZ6S5{)D%T{ {-Xȹ6uS0V0V_2| H4.V3KJDԪ:qz7Fr|ůƼ+é3f Kιh:~nA;(ךZ Vs,+M?չ {>!g igi%62nu4[m Th p *dЖrIeU`uu]gw{6ȋan»!GYNjaׇޟg_ެ+w5ۛGӧp2$}M>$AѬ/Gj}=M bsܫ iZZ XT7A=L#ʄloLJd$aH2#FaʩU9zVj2i@j hrrUN::- zp}WWj]Xy|u;IKZэ_)'cͪ&L˻Sm>@B;qGBv$cqk$P>)xyZx'WgpˮD`9cV7BĊ1o|ǿ 砘g VR#-CQ$Tߢ);_sLpI(ͫJJgiEpKpmmy_NwY˫'ҦŲȐͪ$;tzj3it$}M YS?bchc/?# -saC,ZP$Eh퀉[ϻ_v[{Kl%fB*sS],ST1dڸ@Vg&868.wIMD~-G"s>5&HJjAy.I,S,:`mR'rgetTkG W'J`<Ӧ,ĀR?PR/\{'G@t*Ms4vtcxh?o/s<;#a7\_W _&Sʛo]~_i:2ߚ(xyU՝0yVOhz+Xؼk *?N)UqvG܂!H_u˒R>}Vsutc!9>NV4Պҟ!=Af#Ez|Sc _Uteť,?yF &2c.&IP;OeJ,,a2Xn.Dp'>*KlLֆogŹJ>0zgnF1с[> ٗ2udbݓŚ$f.ˌg4ekѯ} E9uux7c6Ƥ\&&Mq9d %(')bS]ܯճӕMVۋhc.9vWğYun6mmML(n2+Wt?$#WPr Z˺cP#gcD3D>J1Cq)G_f̧y,q[+Cm1N$s3safL@]Y?Ίmm!6 ?v~g̭~Y=Mgx6gb!^XRz&r%6UN$'q2+ˁlsoG:D%)I2F2.@3cS8N:*LZ BiK̒LtO ZηD*>߃wg^ ajcz\o*G]l}|1,h.r޷o:.2UJEsro>Lӄ\ZK-5gZ)s`@Y9=M8,eZ-y m.5T8+djTÌ&bNk: ]fd#%J.@zR95ґ$F Rf!^)iًIeTDRRLCiy$N%kqMJ]0L~";v]"o BDς=0&'smm[ZqjR.%i\4ªK4u,{ z_ ϟoVӖŔr*dJaAZ-rUn@W\ SEڴ|;*_~w#R>xfqjm#myQ}Uٷn(Q|(6]^sG~ dWMI5]惘FT5L̏'GAb/&G_V=9.5NFF׺jZwU#73P#Xdb7s< :Ѥ\"Z40qb9|8q8Oڶ/)]y {.)@YAMfe|8eQY $M|Lo~//%~?^/_̼?] 
§xNh9W]\j۪[U͍SnQ/cm6yG[/KB M'1{}ı$((4|!#S&&` .skL5)-R2<4gFxFzG#+vGFO ;kk4& CҌ{F Z+<+8Eq*'N+ ;s);7v6 6& fN%poŶ4y]A}]YnwZ^R߹!׫;7Ci4nbW;"m12Œx@;"0,PvD"2R;"qGe(]`Kh0tp) -3]+DYQ=]NWгaHsrj؞=u4t %ݢ+րXOW=xWm&BmkOWJ\WpTn*=qL>CcbMuUF }9ߙ嫒vʠJ(!D*l:{d#=Vk  naY|?Sky|oyU@R)joذ7~ʹi9,dRĩ1Iopp¸Z,H8`t-Qj3Ե\)Br b<BRu#HWBhd@t$,BZ'jRs+) 1: T` J ]!ZCNWꞮΐdLӀ sup `]+DYtu>t|u{t ]!\L [W>+# !DCWCW+D{g, تra28N^s}={V!(c藟}zoY)9Z}xu&9,klRG!v7o7o dϣ &cE:pGx~yWqxMoʾ[B4R&q>VF%ZVX[j:U'W?>wѳcCQu8~~h0 vw!nz!s/t4TH8g%`&ڥ4M弈LS!m!>T:NmE0~GnhаdV (mNb9H AnjX,eBxGwFp @~sfYZ:[5F؍y,h((.+<$n|v{b<[MM4V|2@Z W_ +^aVo\ l/XA >6T2&?z]'CV*}-=ATؕ9:xoS╉Z;s-F]N|RB#mZmQ`meE5aQe ~ɧ6yH{]uz|-^Fٿ|[~ '._%?By͆1tD5zs/Z+.ЇΏLeP%4P{U]f Lqꊛ8PG{@w&k0n2W^*ՆKԊw܇0~WW!8g٧E GlKPo}35Ǝs!>|+EUvexߦ,'Dʖlݯ;CQ9TD_ 1<Ԑ=+5$~5T#RC2^mwDѧG]ѧ,[ Dթ@TGDq*k`7[ uhnܷ,~?PpxnnE7WBJ"ca UQ3~7?mejWH؉Uw-!Q2qq1̢6M=!(ŭh&)h_sQY{uچ= $vÓZrٖwlw}:V[woG'}|@`FTk }bS\yKm5' J۱xSڡCOZf ҄` 2 ]!ZaNWRΐKm/ݷuYc{^XL{{)suocdV2=D;~ͬ:I/mZ0nj2GQ9Y+0s8)K, KDaJ \BR0|PQ {)gp`+l \Mh(th:]!JǀϑVҀ :Fp٩o#3+`!#`]!\C+DvDi{:GRUv8Wc]!ZvDiHOWgHWH-E@tUXWҮ|7& BCWzthBΑ唅 "`3pM0ۭ-#_D];t%j<}`JO׼\~bjV(]3cΠh@WCZ ĆBW1+D)iOWgHWxrTj^)j(ř_<cdԺ`Y"ѵPt-w T=C]+vFpO!ZκNWkt$t" B'E\Ii(th(eo]%]ifg>t1 ʼ.]zt29~);huE:i\ 5E/ExS,{@䊀 \+eLt\erM$jdVDWk ]!\+B+@+Xw["u=K(R pfXWw}Etj\OOW3zj\vԸЊnk6ՀTOW=UX]`u0tpOKZFY QvbP &ZGT:" WE+1i_1cea)f_O'tᒑn䡫fRDuSHV}o, DjǓp$;Q)I=.Kޔ@/G4g0k._gd:y~yY/Cn4g^) pSkf]g{6W!~w 0&g$Y#4G!g$+|c^H6\ orm,Rj^_WbczSbvǭ!LgXblJ(K3#d!L>D 2;0ڦ`#>?z49%R0,Pu NX\7 8{-GefR됩>5Vt\fy[q4f#JtnKlzUit楣1˩+Z7] `ъ{)'TKR+i ^#YN4h[C:٠QiT1LÆŽ7>/!檶ÜUgH2B2{8ӥ>V*QS9!Ӂk0Ii Lp;C7|"Qn'71]iLDi2!4'nFUxhGmxrymܬT2bz8Gky{W!^_/'ݹui=R3n63v#QU׭/{CgSI,e,ٲAo9JAcv!s[* `&bY4-1HĨw)ҶYWywop7w%0{[{T Zl&*fn<`ˠ J/<9(栘kkPVРdy1m}Wn']D=ɥ Gf70 DmqQFi9~:XcrkS&S}H&|(5wԴIg4},/oq'KwrGm7VfE&Ԍԍ- jZyΠOycJY,.{-jq'NPiYG"@'RpatyU:L*)oO'=̍<|(M?8lJܐ}+Q'ܘK8gyH!N!D->nչz\'>klX1cpE\RxD )L iɄrɇ\/%EU"q^@qC$ɉ/vR5oVkl!6plg^֮ldoo/yz{=gL΀7jc`1fF+3\{KmǞ-˱Pq8kPֳ Py-i #RR sFL>x=|Z{c/>uk-k98% I{nj9e[}#d3jT(E<W)3鈳ɝb 6 Ǧ>59{SS7 Z;%@\ Sv F듛qFr :|oD 
FgA,Gz:)/Lj!:P6erG"4Ti)SJ3/uj8eT2/pbs9gd(0${9#ۉ,*nˍ -i*%dRWw&e{m}LB-0.~~ bRB,t(?umZN7+Xqo-vCe5c̵l p_+<;:nzB>0J22^S2,7LJ[ p)@\3@U Éi+ZfjgݟE?_W Y?vV/yòjvzC&/WMۤ3+aJ cw =q91ME"xq)xR\*v)Nz,vt8b3UdfתS_blrH4#"`œVQM36\SÙw^rRkÊӛ-1mu{]ܲ[۸W2l4.nİ[UTCe(4nP* Kt2ƀ#/7%ˍE*ʍrc`X똣1%g8EboCe85Ax~.Ϥfs }0_Pyaw0(ϭN$]/yA湟p)aic<bNsdRNc2gTWt:2#5AX)Cqvȯƿ 3n,a3!a9B0#FK ɝL  2W&gs9 XvNۮuR6̭瑒M7hoW[* [k>\ox-3p.Em:'҄:b2yb >PKb@SJ!xc:$pa~њAm7<,w $FBbQmj*)^"xQxqOJ}vۓdƃ(>ϰ kӘ %3nA Lg&P&A%ֱHXsN`\>DpY㠔)|&9 "#"CjkQ]x_EڔJ(f<x@ D(eXKApmf-DKәӲhJFD;>rCr+Cp 8 b\J"f s3={Gw -zx} \/~[@Jri͓;=tOV7M Y1La2'Fa礰Y +$q}dR/vv*C`-N-!fw ‡ۘl\2 { [R5!]گ$IqSϘ6]tYHʶ (o'vt{J/w޵6Qdٿ/3Pv>n1 1ƀw?l|b-d`fcrnھ@`[]2o{NV[|/y߿ʿxj0/5!{Go]_}Ҹ.vsjwRsuu,*#xpԾաwlj[5қlJTO\(A ymNU&p{Ҧ`MvHhI*TT,I J]q!:.` d!&/\tUtM™U7v!4?ؙE|}v<  bǭ.pٽ:_}^m}:rœxa7CnU|"r#;0D$̉=VkOD(H8E#`iU7C+Dԃ W \9ON+oңUrӸp&C;z{(pUR$S;.q=KgV;b|ր2Ճet&|]ـ"IM{ zk%n<盏Nw8poH1/Gnb_]peUnF߸%}c {;u0p/w 3) 0T=aiPT.$Q>0oV=eb> 5|/4؊?xh>Jyd&F3jTQke|z]E%iS E?ˢ4룴YFM#1gG_cB/ʾѮ<>e,k;rNJm.n&'RFҟCP(& $H|qTd\P)&jQBrC(cZ"4xQŔEZ$5'y?liM)k5Z ^ 8gX+2TyᓗւYҩdLFDU*F I݊-Šc0ɺdZCO_z}[RNRk䒔!pTT3!KH-IFRwȥB(`1cRV"16`ltʪHR,-u{ᎌHf }Q;eeqHh7asw 0ih&M ;J0U@-ާ\1XUFҚw-bUdM~%|m==2`bwu1A_q[FnWCB'dHX:¯ erZvd_UY:QS#J̱6gΉR,mtY[Mޙ{pרc%satc5H܉2VM`rke_GRX:bBZq*ii f`8)5X'i.f$^f!)фŠŦ] 53,HX&.YGV JoBGw3+ե$PT":dP jTސ 3Xr E #1٭(Pɒ n

o9@+zWmi>jZ.]Usdz n:X h`!Z<%EPiDiPyU*|p|RT+W0,9Ѽ8{*ċ@%AYZ{[fP܆`m@]o,TɂNU~T}%*y*֝ m+dəj&$@dt*E^n}&USط֝GUD|ҷK*a-4+Qw`=HݭWhC*r]/oՒnA׀R")e`PJZ%LEK I(Di:ࡃ,B 9^@|6ٕ;,u惙3`0cduCJR+ !:MȒ,R\=vEQHYAdD"QlVB5@բ-By(?+ >&|LHY%,h%bl9MـJ6-ʾC:b8k4Q8FJr[6i(RIQ{)@EA joV[A A" \M{-{D0dJ-6mM:̃v竸)GCmi%ǴR\ەI4 h qfQ:Z 4Q\Cژ9Ev;"'2Y]^"f՟Fj9BGբGhm2& <=0-gsg"Xhg]r9V3*XA$L $CB6=<^r` T߼UafT( ,7bE]Q("VR2P$@᪝r Jedy ahaO1'JuY1Ԕ*.FB,z@U hH?x.ڈr!cΩ Xl+7 fj ƚ*Ei L>{jfR@&CP ZTY[~\Y%{P|P}!b JHĝ4FC 6 #<paVCIH G΀z"2\HO8%0ƇJF9T5] =+~ %!.*Dx( u{cgXbXۥ>>ꘄjIRI8 ]Xz+9b $Ti`ːv "`}}UQnuԫ^M6ANFbuبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Qud?qHFٗuncԑڃ7 Jب -y5_#{'= H%__?~~2?wu6_`hOԛ!!DECEuE'G]${X_Ƹzq4/G@<~*p/_nb($Q1`5^mi ۯ3`OX?bH!jc-O M 6@biC?A&u/ky@?1̻\hx-Q:A'W|M ٫Y:VQt𾚗?5am[t<儌x]vhE/z:ZwgT5FE8*ы? xóū?ɇǐ8DzB/+Ƣ;Nj(<e_Olp#w>e;:wgz~5R_\n0g9ǝȀ U⮛ b÷^._i[,+/y)w U~=BqF攆HCfgh|&.PsL?dяpy~v6ok~JwMƯn39{vrL%.-nzsh!{.nGOG+ZN{%)IW_aI!~l(ĴJP%C2) v ҍ9oQdSDpk.d߇hye zN^;.WlxF%hn~.h3 7Imq%i~x7Jc*^}t[yȢ})͟>?+,}emK9,=W0\YnN WO&_ 5?fGq;c&ms\I]b&NXMn[$y}|~Moۥ#Vu.uZtB-FЧX.⾚ vȐm#Ta&8;Z.WCη!㶦1Hj5WnK >WW̕ꁄܖZ=vw%v9qN]sI-쓛}rOn>'7f쓛}rOn>'7f쓛}rOn>'7f쓛}rOn>'7f쓛}rOn>'7f쓛}rOn>m!5D!NOmwmzfo(fBϛF AY6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauQ4ݩC2B?`:܃7 J٨)uFQ:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauب]mo#7+q%i/w{!fIedڑ%'֋5emjZLVSdSUduFQ'ouFQ'oFۙ=هc6q5}~ ~1z/h@iJ4\J̾" "@)EWj Yί/ =1 D* Vsdz7d.J+0W}]`XW?9:̲gEzHp]1]B@8WcQrg A ʊ=8sCxpY} ߻+3sv1Q>5[#bw`Chab Nb36Y'2,H)MRz8B}@ -ũ.?,>jҲ%Qm-8{OOZ7a8=6(H%5/{ʶ b' ~ٛ?{! SKlz!pv+zvcl~>_r+Pmi\ճmePKMUUX] uպb7Z'~jءZk.+9:B}Y >%O+DiI=+[9\'j7s+Tz7A?/fTecdoEd O^OFq/>*f}VY Qy!mq>4 B Cw2 x˸ӟF ,>NhFP+).9|3?BRآbU.9U-TT,mQ1\ZA%)d`&m2tpu23eGoMIgzYZED Q-i0ՋЕߊ + ]!\w!e^'K]M]NWWT g J2]!]  
V7+ުwBdC5%Igw0&V>R-2]=$\ 0ӻ; ivJWPmxfzSlXM+ke*th)3}+DiX=+]-}עKcYI!#vÌatD휱:_r$uY6L(s/hGJRTAB& \SRN넒c(]`+әTAڤBW\_Ѿ"DL swpY2ʝ-$L sʓ+l*th;]!J}+%)]!VOfj  `6!B& d+@;]!JjHI4tkD+H QJj*bI:BBBW<5^ ]QSrH_6KW&2tCvtSG_SlNW3TibyBt)4BB䙮Jv.% ^Ex]Ě+0 GڿN&?`́;7.`A>\;U8V;Ҙ5$DWX0 ]!\R+DT QjJh'm0Ohzr ]ZAz]!Jj3]!]Im% mЀhm+@{vh@+e(c)X "\Z;;:4#J4jiF7i̇Οzp>99xY+qIˡNM}8jLq29HQ尾vH5pgcpNW`}UD)sԺjfL$DWx燱te*thUse^Kr]`!H2tpe25Z+Cbb_x]7zǷ3uDw vC){v+CO6%jחuۙՌ劃j Vyۙ|Q^'kk[n$o&MNʙYN}XL]ZdBQ % \LhM'U$9J(E*NzEW&CWsrRLWCWp.mBt &Be2] ]ikn΅|\|}<D 8[ )Oj!˿pd*3Ȣ%midbo~ ٷq;8xb0ȴӥ#$*.ե(N87q JSK3Jw,~=scv/T8k;qr9.V Ŀ3/٘f>KU-=w?]T4ahR*󅨤+⮠6lY5Q7~~gOӋ|ֹK{on0V|6`8c!ǃ_PZ}uR\XV53LwX}W״6[Tv._w2NF#b499Ai|U>/'a2O0cA/٬×:ѱoRg仨kIKK3Uvjmﲎ]+vmV`]/&*>uL*/c"wjw:v3J?}D'9>x.e^, _: g;ל.XѰ\/ ~:rށM l9آ^P&>MbH;*{ 6S%_¬pY֛7!U>cE / @DLI@KJK*E <ۆޠ rSA;GwOqnd|YOdkkU J_ *zD𴘿F/״g| ߆Pzb?zY3̬Y1w7"묡khNJ⅕zb(-Y0M6҇@u2v [8yU,'1.]n%g" )xҼLq=Z?3&l=S>3X]^ܜ~v(62҅Osf >3$?p\zXGTef;Ꙙ]9lUZ1bI}{naCgV(O*/+ϧ[8,MeeTC!WJ}r~wuAV}t>pr/G7UN,u*62qk>pL>~ I6l6/ nr~@!73L<4<ٻ3>t28htHo7pz%:kYTq$/=Z#?qoyCI0Ɵ,nG,MX-'KCS: t D >lO߃(-,EAw̏pa&nN.6'[R+{خ:/3W[_վ;VphX7ZW#,g \iDD]cJQXUX68Fy [WaTH*#cJx͞G "Ef[ J2T2NtmCnd:tJ(RPPz#j2Tx~`:f~4kZִiϩiIk3(I]u"lM@`)9g$ݽ?%@&b[2'Sk;.`-8 ~:,&@T!pm눆E*BPk7($ (g VTX]#KZrR UDΣ-),cEa)եpJs `m8/lŠ?-ͿA6yTJؗaL*#С/DI"`{S4kpptȲy5ضO&,u uv}ݗEOk7vM}O>۴ me2pLe.FZ -c1' xXm#}Fce迤&"U>ݳFPW%}.)I舦>EEck(gX+Tac]p^D;zjY;[t^ EPUdqA9o^q&zi⸴(RLi •w.H;g01wy Vιh>x#&(As(Hg8 G^kjE4a317,=2|-Kq*Vn卵<˭GeZ1wAl"&$1AbX'-u=څ|@ aj,Yx`8OG oP1̡\ޘ> w^w7خW_/ƓMDad_wy;_>p3>E] *_TٰVf ͈GޮoL|2=^ͳ,̐RSGcqxc7Q;䬣'ǎooCXܕ׋ak1LB3ccI2&q"O O'.SgO9S%fDsWОż5uvHG08 Dy-t fP0fLwLƳj+wt/!ݜ09$=X@9 IhCp}utJ֓P!(ZS ! 
dtf% >}?^Oѣ@Q{Ú-7cKķtݾ=,zrrX3 6"q0v!Z0ĕaDrf耥ڄMoS]t>mt\(jmz k].WH>evhXb#VG# .iOu֠KfAےǽܸO|>XxKp3GG ٔ)$LJd$aH2#FaX/  ]jo4KOL1)(SbXMTGjc7ԍy-ō5HOLfR (6kjŨ cT KAyQ9zV d9՞ N;5jWfs%2xBc6Le_gSԄ3 : ?x]V^OӺ3/?~ju޶IdU$fг {jv,,ozWM_}XAeG ar 3 gStݚ.[/$/vcgK;Hy4r\?Uw͓_,mjڤq~=40xz׽j|C`d8\O]k頁9տ)S@E 28qs)+" 5 䢶I.Q1B[od#|iEĜUԈGːc"I0UhDZbNÂV Az pk4N(f}naE]>Q{-IDeZgQDe #'fU"k9oNw0V<oMI=kfȚ@)f13@g?fbk9Vg@GJ -saYg="E%( 'A r :үE  cAX0?RtX)*wW)*VcTHQo0R5Vp6ZEb>O{%l@Q8%ia4H )/Rt<_Εa&=\ڑH\ Ot42Gָ-yVAq ^Lz -a?r~y՚+~x]'5h2ۀ2.U9Ŕ^A-*6fP4WVҳVjA|^,'7%c$)u#6cmqH>]"]UЛ85M_|tSV#7z.D$E91\N[FqZd}ir$KT(ElRW0q9m1i#ZLS1kf\~89;x-!Za]Tٞm9e<0m c<=FS8"C >2C4F:_u`H΂F]ڗLO.(JSUşlʤ sS+BhD9bu%ݩGѹc%Pq]m>QQ5eGuIEkNKW8QHją`f"DrhtɃE_Dgc_}'<$saXX8+i0SHoV T_E "jBTch4#F#Sͤ{/#,^^T@`7nl?cqSD+ !w=B8L>VnVE嬿ȅ2̮.`friTx{Mc>cLYvDL'WLI'v6nt{2"xyG5ڿr#p7 :X]SS*[dJ(׉ 1 `?Uvpe%oV3/{tO7!BXP)sVW)h^Oob8Rp`mLMA]|pcGk[lCglљKi8Ny=,vQ6z^ޟbrGq Ǹ%;[bsK]flfUX~E q<(m}?Zv6ȞZR;a!Ah29L/kp6MV+ Up\?9(ɟ~矿~?Ço~O^3]0Oߞ:pyMW^4|Un&|v)sn`.0%Dx /NqƀΉUc&'RqCyg8BZx]$=g"#Ws=[) E摂eJqhWTc:Rm;<*І$fg\cÃ,RYtX c P\u+Cf"-0s`Y__=Q|;Ru6&ktgpĶayQߣ p?ߏ3 mR-( =i`ˠ7ߠ| ;8+zyIA_E_?{WF #j㢑EIʲA#&2pORc0hȎ[xib^{:'h'MP5˵|m e꼄殏/̮'˫F6r4젖e#jhȑJ窳֍t.yrj S(ن$d巘,$(Қ )^kjBrjc7.O_ Q(D&?[kX{˭͝`t<ŖRAީI.1f祝Y0jBiWcGNQ1%c\FDߋ%loAMO68yo0'zq^N#2 RuS=tP C0Qg&*iЦ $vzwaY*7"bt 54*SL4( }m*eAtuaϲ9UǪ#Q4 @MD񤙈x.JiLg9h4}.t%?Z5jZմCjYkbKUR1Q;9IO3͊Kue_r)K˳>]*>w Q,,fI|j~~43Y.W<77]W W^5` W7~Fణ/7n? u32E|KϧƲY:hpE_,h~y'h=@qKh|rs1.]]mvL'y1-?Cb؉=>Ͻ_OiDoE6e ALf &S)H iF9d ,;7.ylrRk2ZrR^dZ!ULs73MهܱED8a/;zG@v)HmB sYѩ9м4g 91,SS[n ᥤR> X:ΎzvrbU ZSNRRx˓Nt.,EMZ(L[̃63eЮ@*P`-!7(Ӝ pJ8M)Hˌ!:_Q`"F+ jII5NzҪT&3,Bb >lCV`fWENU2p kj8:1'ՠ3x' !2MDi1Nek,PƝS[(Rf`IStQAAUUۺˈJktC13WLgJd1|lQegfL<1 kNR 5aءz,X~JB۱v؏\x/!% Rwm^PKׂHZ\pxλIL7dLFA݈Z=2^Mv%`} Ƥl^~k~d8@fÚ-cK ;\Z;6|roI1 2L&4yFRB""V1A4 n55 |a[Xn\Ug:A{bj(XCB&m5Z8Ǯ:Ե97?hiB2,rj-n-D \IQy<Ӆ=+ xv\dłU1D9]BxVZhKemMfyǟY+4Wc&-P6Le^6\KLEeLFԠ$ 4൙ޯk&S==@JqY R^N9. 
!T#^#S9Vb+k d86AN{Gi*[=xj"r!PDi"e^R N{m>۵q*L a|%͆C~ޛ.hPhS$RG_01I.b}eT}>^xOKI 50Ryo@b< &s;+^iUg\7=^H-]W-=,sn&o p<$naT >Ic9Κ@xĪāfA+Ai y$\TOa^*?QRL6o\ŨB!{zt(tN.QJ-\ƩFX鼦 \)JAEo3 9^oI\O\tQl5Xaψ) X۳a*saj 9=u2JI+Sd9စ<b2\e\Z~ G Us͕Mn7(u~5œ8zxL1c|  m?@s X|L YPh^So|}u{eS~5^w*Eݽr/lԒI4%` w2YNoL38`~F$eVSe54VfYZ^==-"kdp&O顇g&{l+4X/)譡VS*yJ%O)TJRmH%O)G F&-jqM/YWdl`/<.-0"C6yk4)!<5i(D8u;[:|s ?-/"v`K('Nnɒ]CϝmB끐u%Wᢙ`OTDGY$FH8&qxYaO>sRK(XF#5(Q !;(9.)q GU)CT͎Ȍ hDg"fUy.dtGp (NYiJSY gT. vkgLyXz[)K>SM?>z¯g{HWaϻ9/], 1~J)RP>DqHJIՆ-3իt]-NU'9bNnOڃ+y$X2ti֥PXGva[|:e5G9 qT(%jP`9&( "VGIJ1UbR1,Phʡ*HXG"\JlB Oc*5 1 t ĎR5g%7rvsӃ|eD R`*[ۻY?Ć#.3Lӎr755yMQ<AH[bѠ!"4 )j\7."ɧz\]pK INR1k(2.{6w5uqG^GESO?T]% ; (F"rrNΥSwb*\I5&xyl~+aKC(`rT\Mٜ\ 2pgmYZEaTX|_VeŕI;K0f޾]8mK%3ĉauvjC_hIś_&g U^9L~*5L$x4Mtsp9nz8=mu]v=@/XNa40Px]ErԞ\ONQj5O_Koߏ/RMqa"yqZE<}9O.LqpmkJ *`y]dzgξ&?{2 .u&}HAϻi_Cxb -|vw)#wq\00wk@x"6N6Ǘm6]mAv}\_D L!ŝp6` =GqVL=Ãixim}"ΣuҬx(aQ# #q4KDSjڿQR\>i.>jDWUa$O*Ej%7iQh 5(O+LgsݓOoq5xc(Pʍ"6rґaY;X+\$JGFm=mՅ|Fq1ꜷ[Vn׶^ِ}qtɚ'h_(cz!u:BzjRha782DKF.5q2bXq3rbBQyDF3O+Ҍ(&;=/)k Wer<D;Wg%_ c$Zu@X+wm;.܆+W߯lzgCI"J&%)Dޙ}zZ zm0-\ۋk;~t:N&9m=kK̶׽*/iT9!1ux%В! &_4K C&_6Y,˜H>¶Ѱh:RR <uKm4& _E~̘bg~sMbιcO̖p`kMHͬfK| wkKM&zzx!fed$nd$CD)")%J7Pb%0l\/1IgsT>8 0w["ln}o5n)e;t}=Vjgn&Ct%4;PLӁ'IW˗a;r9J}s=A)~r!_B.QDzO_M,I,>q|i氐%Ô,`QjiI$R!49<ǒ?&jK ^Ȍ ^`A Ɯߊz|qzrZw6jm$'iPpe84% (LQk %/P!`V0/2b.D؞d) !,r!c1S@kv4H rd-.#ȁ%=oNX710BB2PPg($^2N Ij$fXX4#-#-#)v{Pll"Ukժ: a=Mv[y?!J޷,>!Ϻ澻ΛH%2N3Y)hIa EBݻOM2u{-G"=fƞhQt\TG|Gޒ&r {HTTDG)aTh[k+V:d {/L*sPҠmP8&D4rfbP6_怇–V%lvE,6V2Mt κ-Ё+AlףVz^j*w no3DJi3\Qf&-CT#qcJ 5X| ţY^Kw!#7ntbƽ 8ԃʕoYRk?zX4XOѱHtDwi83(* %B# 9`RD$2 @9b A*!lh3z#gGոφR<v#ZM&F"w \pX*GA f4b)H'#qRQN >!pQ_z3"gUE4ʥ& N1E!8y# zolVzStl'摩H! v6ך"CA ik8za]Z6:ar9Y[kZM^̳GaӝEx`^*v x1sV2"jP$RLC6iGe<"@$)jK%XZ-Û)\<Ŋ$Ɣ2B <$Z!<; {\:"m<gt`uf Mʺ<< jc6K̷K;EjE"r,X%X6Yzֳf:H>o~ nI?Q+R;;zjfXה D4Ja&W#rAx 6y]{#p!  
fJp^ L6ݶip$ϧX78 =OÌT]-_똸5[RT]<U o| h]OLn"ڮJyI7'j29)Z!oW/U-\^ (i/`̕崏(X(MTdTR LKX(P%B2'ɉj#o%p* h1JK 2~X»>ߨaN*4]a^Qf^Aǚt=ckMMIôfvҽ><*|~O u]0MQ.J-c1sY?~`.x,ٲL'7nv6|j#fTˀFQ'P+R[2EeO0;˜Ͳ%pzΝ%q8,n|wp<\~٪,k|_#]vD#ݺ):ޖ#o| }i`ƮUӈŒXw2WՌuCW7xa4A|0 7{#uZXpNzg0|^ ē0!eLXό'!JʘĉL<bO1Zks R%1GuF;Y_bLzi 5%bۤgOhw9C;EMi/LjW+v}\]&~ זuoWO pm3\c`p+ʛCUÇUc:&r0]Nn2NogcJwnx9Cmc`S&52B1ȸo5KS% c۰7Fv Y@ǣ4ոp/Z+_gpnd Mmg]sWSsܗE7rX7o# ׹y"?=`vV>#`LJk/{pnmdQ. "(M4}u.D!q*fm ]pIv*ap'nn{dsηC>A.ˡ9s]5m7^2-q<}*%ffڬ^fIurE Q<1STL8@1\SŨQbT KAJQg4YNo$ N@>sS=pɀtճ#6O"/Pç>8rߝdVHf'``y֝ V'gP'xy0հ֫ M62MN  Y&t\r)]vZZtUr[Vpq}iIY%MN>VTAt'@K󱼽Ҝ,>; N̹:tO:g{rI>r! *GmgE-3愕ɌM"ǁbyYd3M}ʕ|PZ&ݎ鄷I+g'Nw-םD,J6`n%ܥWkhnA6rOf}6& c= qHdTb*aV e\.\y\| ko{Vx{97r:PObם^:zM.\hc"Qxuj Ka'֍jwyz9y+9V;~MG4ɿh'>r,(LkN)okX EEj9(mnވ&rrʭL3:bM\d7RJR9CbG=!GAĭA,!߈X;I[؋c>_,qNN|Y3zg FσdY5I 3Rbe4&I5A%XDc^&tʾ# KB*"cJ8fϢa {I#^! օ$ {dw$cFP$aAq qv4jRU{n_?fޏc[>c%c* x:z/H}@m{%2#+, L͸ؤ#l9U!f-85O::]J%O_Z(/5S0IG;@+8IQHxܦ넅,r(':靶ѹhbeɱrK$ mnmԑh\4T ! o,ycJ:b/AK K?RоG`S}k3/}4w3ʍ늟\#n/fiG {Y=Q6vi(`I&fB\hf(yqbHR Nr LI#M*e|R()e2u4Oh(BS0zIX;,kT,N)ʨ4#GyNT&N LzOy1*- [Q l~&q)g]DSP=Q30-`O8KD)DW&Ewo$8SƄ F,;`d8F/ ъ`%h~VcPOI } ĒL.d|0 E{S4kNq48N6jq|DsF$8Nn7,QmvRs6Xwގw1{/[h1l@)1Zp-# LcAroWӲ0zaDܣ~DS`ΑM!ΤDF$82jƪ+T˖{ia9x _i=^+J`5^jP[px]Q፭V5X'VUnTN֢լv5ѸYf7Wo73HXjģe1j$*-:7<{!B;Ϊ`[tߧ'isvTVA_ n xur{/B\,|Q^;Ma\Nv |V?usVf@mݫaM(= H:g09 YZi_ߟZ>&:Ϯ!u2fj?®::T.YqbυjkYG.('Xq9= q<}w"EۢHxBcp%*r5"Tԝ#ِ#cs/n6Hm;m\ g+֐Ȝ }5퓗Uwyif~F+<.;/#n4/ .+e 3͹>sSmsS-e|2*g8lt !ik*#)6gF ȸ|J#}_Zn*Gv= {-oiP`KI8nR}޶ N;BS?x6LG41x=\qsB ;aOd,t?)tdBIf#F? 
†Ƚ61ay GtNGu4D{FY c*h%{gqo]:9A ;y:-F+ O/.kXԞӨnMi BJZOe `*l bz8N*$Nvl?jdx4Yڗ,'ŜIs΅: ͘rwU6n#E:B\L@\9&( "VGIJ#b6" D[b4i@)p (WKM) tvhB=k azqyh]U%-yM!{O|Άn5)6yj "BAsC Dbykʥ|К{mB1pa~Jخ) PitL*"KDAJb,"CHtdG{-p/*P{,*Pj cm(ќrF]D 6q&AI @ Kޫ7QuzmZ}Cmw$!kb*aጳH2i{C֐69vVā Ay4(&Xy:xH!DFioAjc"yÐ|DDJ }5@$y]i+i6\r݌x8}H%c\~L;>>^ .S'B]뚖b=^8|Xq!$OP&g74{:>!jz>p TkB畹0_E+R ZjZNM10BM%1XKh*4Uc1 чbT(`bbx] z(sp18,F.ndSMmH˙PH*2 #|Q5rє㔟}¿L\.>;;7,/|I5h _@FEL27:jM&v`$ooO>yӿߞbNOrrs~@U[Wm*آϓƴhXe E*]5_V˅p܅/] h"7U9Ɲ9M4Ѐ#Q,HypK^Z4PY{Hަ' R\i24@"ɵ, qf,8ͭbaoD:LrKm1?t31րZ4,4b\W8طmDtvgI5IFRJf 55Zc{vO=Q'jD힨{FpR#=Q'jD힨{6\O'zvOտ'jD힨{vO=Q'jD힨{vO=Q'jD힨{vO5+A|B#&Bm9LF ScCQ#PϢYW?tEa]{[ :#8%C62ڠmI+PR2pDB%AYKn9HLh܏ݣ\aMM! ltz4oo?VhRP|؀2@@b81xᯢ)5Eme(zZ vfdɲ:3+C#sZO :oEBfF") N̠h$Ɗ0(rIKD$2 2.#+z!H% M;ΪnPmS3;n0#ZM&F;.8x,ZF3" #qRQN4Oߋu ?1yb(4ʥ& N1E!8y# zol:Zn:>OZ׹,yd*h吃E.j給)nXEENtDZw8L'fۄi-ʈfCmZkřbm8"!m vTF譭[bAS=LE65>ձ+IZ-L = 8ApdFq1H)eD $xNYI#+ [!0<`,w뽡Yt!Z:SG|)DbԄ Bq̠1gH`&Gb< |+c̱|:x#Ky$"kY"M;at-/)lY]7O`z]ePqQh:Z;&b im C!fJ lp6HF7g)Oc.*5X@U=t1 P\K<"s)0}6̇fͷN~] M]_ͻC1\4@Tu=_߱-x4f01ԘQ?ѰAM9\N.Lr39%7ih 5(O+Lfc|ss[CU=y<C9DŽRn&x92J#"N;:\$JH!1[UEz(fU31ͧP:$܏ 3,4.Hπ&&4 ŧJ(,ҨåF)|pqz n`W#y30x/ʫC Ç5`:5c:7hIڕbTXW t]QVBOa%}r2-3ime ayΠFA| o@J:&R+? I4ѧAڵ'}a?4IE!5!8aa%5(׉|Gڲj^~ͻaf6_:ƻQn}m(GÐ_E9tGb)np09sH[G‰uXT~_I;k~NqXtgKNM J֔ϊ:j_Vj~IuN^˖'W*t}.MLrJ)8Fx\s6;˜XP"T\bkzMAwX|..ܝkhW#6y|_N2)7~C߲ݛ76KcT31F6MPz8]9<&6*1XǦ[cdGTc:f3hvgO;J&vo8}eyp=\q1Hj~6I|2c|8GWSbQr'r/Fro|o!΢rc/hE.@3]-;Vw!ӕnlm )j] 0|+Јc lq[\ qFڔj:_Ma&;i>$H~d;DM<;xW^۫.;Vb~3z1|u o\VH~Ev%diՒ.O5BR) .MlWv[Μ6_%b4#e… ?e0 \nVgMi \,?;`gOĢ]ٛRޗU?LꎪfUոTLc( ;yj"^Ձ\}"t ACt6fsU-iِ{\ ":aQ;~:حqC]bu BFKˀ|O`wۃ7NފF0rwccZKƣ'KO;&$IB<] Ҿs?H먖Oކ5=~=X/:–Ɲ^\ $E~9v; Jr¼˙$WgZȑ"Lpbv >MdI+N<_ղzXje$RSb*X0:00;aw>Sٝp \3)'-Yb/"Wwjm2k%QfY'\$8X4%0fNKh2ʩcA»uQeL$,.D ~m.Me"Hj9#K 5FΎU'&e_:bkww;%|{mr \7e^nO;HBP d ,܂#IFh5 )iCl6y. 
!Kldhښˬ&N0qi#gހ)+oos([i&d5"GCT9#,}g y6Iecn^Y@IDLP121un%y{:2{)7&ՒZʗGeɲp4} 7-%w|#{|)|=˾I36<ϢkK_|^۴-~zbfoUa*XM7g7Dd3R1@B暚(FRX ZPlt[[Zn2l/ jxa<{5^^wk4SN1OKuĊ)++P+|a һ^M,;y)FT%i 3kje./$:q֡9xt}BHʐ4}m(P|GG RЗGz{l86B^ LsEM *[m[:6a7-{A#ӹiE^-EeYSXg"N:k_b M]M YQxI{Ǐt{Cq bI&ِH41KbUha$)$b?𜘜|o>Fouu^E6wJmp\ 6[kJMfcY/y 9$~5_ o +wx|=0ɶśPLf7M|Ɔ{" #ȨU(v!m/אx3)j%IZ|qЫHr++/l  vtISnKT3@| ~3./><9]8(Ek18xuHO`sbY#TƢ_~w$=ooOnB#ʀ( (fABV 5'҉7!.֎p6y\ӫl},hV5kzST8Ӆ,nZ\}ޖ F!-qU !9sYuHPP#$h1蔸U1H"%Dp@Q뷲uU# vvl؃?,Ngƛ>L}e=  FrI15K(̖ 3(T^r'yP0$MF&J7Kg*4, wZ1,YTL!b/iKXkA B{0 [tӱ7cF`F1I*‚4eEryͣ(`RU\ ۰?iZiSZaGP-`zU]=w<:;w f̑oW{L.s8 a"rd"^I1hbۋeoP+.^zM22VEԕBgՈ͛;ou}[Һ{c]~zxERpr _Y1MY|6җ+嚊kU`QJq7T?.'/wX`1?qzyre'*wXJNϫܡQPtz"EpL[W\BWZ*2LtpupEi#q17{;Awlhǹ7]ѳ@h_'p$- {A_1|*OFe ʿn.!))>@)~=t^< ɦl$5QL\c!nDƆ?TiJbc&Gv*O0&|7&Dvy{_"q( 'A|DKayiɭÙ{)@8OuR.Ka [W@ĺ5pɥTr|pR\]\q)-+ {A2TκDҼ: W`jj9WJ;@J"<< ѸΚE\h<È7AIɹGl_leB6i8I5GFX\Ϊ 27vf O3.)-8NHqpssn1! &!l%vt>\e.uwaXx$ᤶIĠ_#EF<sPcTGAA,䘙Ɉe[ȼ+kȥ0tԔP&5fF=9 P9^,0Hyc& ‰<>RՆBBN3V&KK"f$/I&˔6SB#BM\/}_1]G_~in j%F?6D?hpUm1-:)ӔPz;p[SQ~vӤٕgow%WOşo1_eV]3*0'ж!% ųwQ-Wo9O5z,K [R+-R3=qWݴig"G.m2Lw6^-+ ָ4@-*SqWWcu -lgjUR.`o3KԲL%\]"\Cվh D* G"Q!Q[! 
%`g10- Sh8EE^x/d2Das R:oviB8aHalw#͈[r8*faҤR ;4JDf3Ls⨁sz`1- PlXiFɟe9CX@9 IHSb.n*J<W{C32fhqT2"#Zss[SF}(X )#H"( )%kI.Rj&Es/x%3vB!aGb)!<~7~[9ȓõYNAQJɻ ti`DW´^(7Lis &Le@PZ[d<\Qq th#Q( 6I %eȋ֊qrWJ.(,_ඹͅL\*ٖͅL羹pNtm.\F "r n \ejd*9䔵Eps֞I\ޚLWJƼR;Nz 7q|uQ`eGz;&E%9<\zBQEpEƘ2?XZ2 vp:pbm+FP+ W#TհQ:Hiuɬog¨3d&o'Gץ(| 2&:Kotx;~p7F Ϗ5Hx#sK4r88ǹQ8IL/q r4B(/уJdtWhP69uX_xF#xfC65bw J > %}-?=+8)K?>!/s]U@7h4^#!r'ŧ2˽AJ=kȧJ.Q{<ǔILˆ1F"cCBcS1UA\ XZ_70&|7 8+㻍 Lzݷk8H蔮CPa '=6 gP:ϩSQ2/f,əJT%`"rd2=vjy4zm˲Gn3%Mcx#t7;`d1tK.hR ն{65pK=(f赳Wpf ju쬘fgWj5;Sr9{v^иvuKYsS%*8opgZq_qew,9Ltlec#z{I2i˖e[˔HLL[WwT%U  #yQX-|zu4]/NCQ0ʥ䖗jɷҾYNBbn9}9%)C(!"nON#hia>A?pVW1ey*FK ⣦4"_vOyMGBtM8Fv\|TF%~oTF%~oTR4ÄQw s ~cNZauVRI0I-  _E?hyrFEyhpˢ(xٺ8l?<֭/OF>z%474_Kig1yr<ӵt#k-rBC*G4]U;Ƅj՞ϽA**%WȊ LUSɌR ڪE  JilzWъL372DZ=,emOb-V ",ňZؖ|lnSEѩlB'NQ^mn荗pvc ͑Z$rZF;b -(-XQw.<c1_u~8 *B~gqCUӀL`whz (\Tcxm-& ҇خz8F 1e2#wugAy q!!j+APK_fLGJҏ75FIf.4Mh"0wz!h Ckb:08e}0zIV-üR(%uxAYIRF gvF+CVZ (S _ !{@s,!d2KBHN1{u46Z}q'0AMja(#W5\?㒌/}xʫVmv5S %sW3UxAzb*'YM!bF,6;ls3C+K O\ø4xf6TR$<WXh矛`>;- M Sp$Ρ lx7$1K=הPm=l64Z7 з>?aB !wnxHPD]WQ}⬵5A{"@㯁y_Fġwӯ,IV}5:xäbZW+k 2m4 BVTy<T8LGJ IYr-,]OuP^.N·t lGsݶOߟj%g q)ZEVo-顐+04\g<+f<3T -2EJc 8 ,]#DGFf|Dh`_vP6}TgY|N1RCwW_ẫ#rZτhm[i|@^ռ>VŐjU'x&Ǣgo}N-gp/$GpXŨbLF#EɡxaclT"eG- "hOv' \@2ɇ?饼 HRSžž ]F1t\tK`ݡlHVүRE0L,mty*b$~ï=Ve\g/C|(SWtBAj W#YbV/^I L;NoF#")tجr,_`~;! |Ncd) Sژes$KIVj KdV߹pB.fҥ ` Z (Si_DbqvH_UVxDZb1ֶBeX('-?;c[9r5OIB FqAQa J`eBzy_=^9xHfBYQ*/[":Y "`-`0R!6k=jʓqW8U( 2F`F"SLH/mS\{1D@9i=묽@7$w)bS2WVx1OЀAzޓu R "Ӻ|XҮE9< bt$NЂ1Qh' Iz(A.>* ~or`ͺN6.籾,H*(E"ݬP@QU=ln<>O!%+Z-Kx$AAk1܌}uD{I"ގFb( %9$$eC2=Wǭ?59g-dLX7ZȄF6PN;Ab{] `O5BAGҝIk60{8NnOO 0 ~YTŒ%ZgӶ zc&R+@H+uC4dKĎO]@$>KnJ޳ Md`DG_yzgV[NU_[e9 $ )Nlqpjfopɹir2ţ jSbKV:b#*t*JgvFs1kd!ꊃ/EM⥞GnqexDSR0nb8m8T5rGIצ7z۹7TVc0pIO2 LFe62 /Q&[3KXI^ilw%U8ܧ]&$T058`hcC7: ̥7 ы37MPƠu-)LYaRY᝚%ɓ x1J؊j3I_q^ZC +э(QBDQL֒^6bw^@_]$PP WptJ* aK d<^5>RXꚰZx" DkP*^JV9Ȗi\z킀|s/i.Ii1 M;{KZWFk4h#9GX-`> OB^b/*W n 9W /cr5"D.zҢh6M2i=Ixowj ˗ڐyUVqeclj~3O%jiUK! 
h[w, lau3@"V'*F$Or(%._Uvm$2ѣ$7Mt7B-T!8UϬev0n/ U3vr\)e+,vn$~zI~Kϟgݗ^'O;>3婵wBdOaΖ.[;U5?ZLkӕ:mɷ8+U{l}(>=s[A.\h f- >iJT%.ϺĀ̉ CxF}U0KLJ}XǂC>zh{O(cL/{$zE֤EY+fr/LnZHDZV}:@8pEFJqz3 t,'|D51d2:jKТy[/q&,7sK..D3')y$R_w7cJљz(l_ ݫƗ/fI ƶ?4ƣq,Ecfl 5[^]?BH ,V$yf_fN'g$ 0nIm}#1?)x8T{,9 L9tK;yEmJXٮYIkKRJ/#w@SZ_$ @$0Pp`t܀Cr''|$%͞nutT wgU@{ /hR0Zfm-=^a&s6V<&A.@fyPQt>&}"d1sc4RdSI0PW]Pe[4[ [5]OĢđg9%yTS#"?$YU+QU>ҘSHWK2$vNP)HMŹyIq K[EYuOCR^Vk۸F?`E)|_JVْ~2[6I#UO@ !CsyTNqR" Ϸ27/T@F/n.I?:}xc+hT4/&ƥy3;^A*1, +G%3Ճ#Vc"Γ $ZIs=_8:i59g'n\;fGT\\&Y^dLaZ(jgmL '0=I*CF.d-׍Td؝>Imɢ&Q MʤTRfd(؊@`?{W82ЇMwLDM&::֖d)=\dyI <{Phjj3gBwHT;-6P͖$%ٲZ!>e m;#cis`y"Jh߿To*]`ƢLfa8a LwgsFOi]Ib Ȝsw8I3pS2OEH~@ _a!$\?lAWzwho&SP|ik}D .wv6)`l.T2-Z3[kqz,Kj$Q|3Ja?`lod~5ukJ*$\0V$`ћo|dtiYCA52L(fR}UY!?J;lsfnkޖ^bScG@GP1h xeߡy篙|qnVRls[1+(nbk B0#Oa k同XXkH QO>cJ|4i<ѐ\|6,h4XWs>J殻tV\ z=FvciCC? Ǡ05zbۯZRqjX8 13~_U2KuF\2wS~f=$77`\5^C0Z6 b:wB iZu:c8t(ct.JWzm>B)FrrgS* ;bl= $6jo 4˓|51j)"t~XUu=G."@Swћaז&U\ &PQ-#/.UZ20oE}b4Iv9z&&_Χ7s[$Q%KǼNJ5wlrMs@aO_%i+FYouѐstr J&?=JWh|yD=+gW l,TFY.4wB7%k=wUqD_dwh?3߼L (YA0 8X[Xg0z䯆O^̡$\##8iJ%MnKN2>#%Giɉ($UcSnGDGHj.FQI*,Xa!FLHs Ӭ˭ Y8-@fbɴFqNWkp=x4TɅ}|n~C*D>swȧU`dEἼǞҍT^4J2o)&{PJx&#'cBn9ztor"-y{BbвnluN`9K1͐Np4f+fBca˙zq^i {i /$[[Saۺ׶\$Ёn/Z3+>2 rqCv0E!/{ϓo{8lIpy!;Wk\h=̠PmcsNsoY2V^F,\cޖ!^8#=A:&y11#6EI2>*[#1$9-@Z/o u:Tޱf.49˻r H'x88Qj[8R4_ڴ1p#3ȇU)H[&8xvd'SzhԂKH:jBߊr+ 'BoL3sQ(L$.LOjkIE痺lzX40mh-"gkEv ey;l)Ԥх hągȸBL[k<3(8DpnA`QHgJ,U,7Hip2 oWRh $s<-L y)`-IF Oc]md=v|AݢsRޫ/'gn1:0y gJ%mpk bb0NT~NFP%qNk.i58jUݯ mCh8"aΌ \@]52#Hu{'i8޸n{;\&{$mkBR޳;N{Uϡt5g{sDI鳃kdu>lG]T),=pm$j#F؁82cBYU"&DVGGK!5]jWr>HbSHc#NҹHєB-Nr|l,۵[egeB!Cޅ0vAh+kd ƲO9!](MIYi-%o(zۯ^Uuôox.ui҃Շ”:=@,dG,lNzZk.,Tؐ ʁFMU7fCB;A[xX Eu Q\+pz #2w Ae9@c$+uqv4 #(+dT$Z`ʸ^ڥhaʝ1gT.sPs@Ѫ[Xy8boMV*f= ra6H-}VF(NuԪz\?N*Q{wNe'Gg^ pDˀS78ӑE)¹qDJ418%}[Sh4Ί&Isb!Π䡰w9%:?ߒ45\o#^mkq{<8cBM0ΒQ6O4h\4f~Yآ?GOo#Z૨kZx|#BjFh ;Fp?:Rv9CLB.&`!p5eN34.h7[oKύ`XM[ YkqZUMB4hmg1$T ɢT SufJSU~ݕ00m;S -8Ie$ {:'hLmߩݜ+,DAGS@(FF!T]%C'`l ZH!97ZrVic[0'qHG/N03 )lR \BCJ {Q+-"o; ˰vD݊eΟ//tфfE1e>WU:0d>裟Pp>ojB|tM)"]eGt᜘vƍʄ*0+'O 
.}us>V߼mgCw7)xrI+p5fK;s(':Mk8L3&@mVf 8O)eIV!c8m2ëS'SͰjG»*I)xGK,Φn.0KC?ӽE* Np "HW45*^/Z);ȕ >Ґ҄ئ4ClV% (:!t(>==%Dk3u?mek^QM_aLO>1vr== ;z^IQ;IU:&ũoe ZٝNY{߾,J%O(Gfv,p i5)!,<>KmpU G{6'c7LzݛN}P.yu:]rPn*AKՏNKJrP;VQ.tH)H>_$eELG&S=*"}M?HS-qCODYZцΕ?ɼ//uqߋZO\W](r]f7^<]TT*C|gv:|kA |ers\ Ĩo+3 a.TuJgƕPAp>011.%Jί,QoyҢ(UQ@Ϧ ?M<,#*vʞ >|pu}o)"D" f%1bFvaӺJڙL!y٧`kL_s϶ I|LЪ#֦{nE $7;=:64&)z}~ Ct=LgeKhe2By7KyQ9-JܻH-(LR;RiThMx# eø#qc:vⶡ@3&+P_Ogeȫ:0m }HغKƄ"ԯn)fRR%#fG[&Zsv6}zP?<3s&.:U$6ny b>ח$1Υy$1\p"5Imbs plS@$g*Keƙ2]G3JVpVa?GrKEADy2a |r (<|8|y4+(~^C<ӯXO{s: %|ܿ)\Sdn0͖eѹ;-/ҥ i< /?ͳe˲/ʼnRntP%(* lzY~& ۞$ 1t{$XݮD4eF2R1:~\j0ކ/6o9  J_ՠaWaJ= h?4\*QmjzӶ ic&KzNU |0wWجL4Pb+G{ <F}0 c`.vo.edGG6̄Z0]hÿiuF텶<{D[=鿔'+h+m^l9>MֿXy<(5ڜ؂c6NVD0O3#FЊp?k§]BhK,lfrt1 3Vx7 Ъq6 KƠb*fhKc%?/ɗLR:MDGTg>Oj?nҊ}o|J.iZќ yM}pj[H*Qo}tZ~]=F"GRQ!J>R:^k +k'0l 6BA Sd77H@jʩݺN2NJ,ꩅRSUnb6:εsռb>9yWUB(Qlw<)&#ܣ 5ڟb.tUVL5kt̟uNnXB*]Xeׇu-?Hcʨ Nl[kkt=V* .߇KD2UD}χuOɕ>`-[6lFk SI_6FIp79+D2mSF 7'qC]Y:w]QJhw7h)zAVPFf=qܼq:V64Ily'&hY5FI]~՗jF|w5kͼi.=W{9.g9T>ji6^+Lٮ2/}yЗm sJwj[ bBL:U : X _JD޽Ӹ`@OeP kHXqvX9W9x ҂s7jާb+d9o7?gw9-'F9088RvMU LuJRe<#H2ð[^Ll'c;fNQ V0P# @v㹓B!f86\U-d4H"Ϲl5(&r}Uߚ_"$ħe>˝O5Z|rQ>;:mŧC^ bk?RrǵI렇 W^0Qab[Pr}iAd >BqLvTz3tʞ`^HXkմBFV+lQ6JSӆMVu7^-o.TIV?rF =}^_"XUks39eo?qe3a̰ ~]!D١O_d" f>2Ièﭵ}J)7ƻ_65@$6׿hGI!|G8cl[>%E>$_/ X`+^k0L&!(O+TX`׳85q+tQ=;'9%%O)ܡVKGo-RbN `i Q$JL|0֜"*c)dR4V bih9!:0g䏷BsH&Q Wd9t*CX#mT x:m"2&Guge9Lgi>͇4RQy_,bK %N,9g/~ >WTQ ضbbX!fL2J2;d:D")(gaQ4NOpҳy=ۋ nq0}K!Hվ[$/w=7?Zb E3 2 T&yB |*PdM ncP!ם=`'`ݻЂ{ľ785/{20MsD(%qIf :c筸픉Zay~=LU^MN: q"f`4):5AqeEEgkQ@ Q K kFxRFJJ(T1f*ʮ,ښOCiiJ¸`W,JQA \>y5ǍΓ( 9\(RYF4OeeV 贚WquTvܟׂ;ZR:5~Y <$3Qb1f&<jWBz)&Z͓,sPa)*p0jN6[Q"҈D)2x+IP[a18Md"gkXԠ|\^OΕ?mytj *&sz5$1xr5)&9A)1H|6M=2JB(y};ex眇-.Bm(JhU!0P:x 2uW@'[p;!4ʱ(jܷ8A1(3P*A5ddc:uɥ&H8) n#-a ]2OSξsb!i8HI0&E 8,>XX\"<ٺmD#A@aī}.p[h@qc+dZN( 11&Hrj}O.2A1˴p =4+{iM?Zdu8=%X= PhG ^o10PޔOcY‹m-i!l$"Dd#:&E#I2W*(/Gk].gJ=rij( ]ZHL9F(sp !óшDƒ+Xr>J|D_W^MO?PY/ן??SrL_!Ĩ<Ө6LuEQ)+AC[cu-){5'_~6KDJhT}&E~"HIEa1Y!5Duai:y#g{RHc--f= T +@(AcZ],rhp.Dygg7#kfk 8yRZp8!Y_Z8K_9iqa`% 
~CEƽM\R(鴪Lur7MI[tPj9+x߶K:!UcDX 1iQ"k UA⳩=mq=Lew`CR>-|ښ[JB)8:+]7"E"ik8cbJ.(R-2`zT3ڕOiv=O1omO%S@!~oc:TvsI-A9&YZJvXtR}qS o!ǡr!ǡr,\?O.@ VT{R(@3V8W骬@cg)+qLPx_a4?-|[JX'pY>'Tb1}vP9Kip[9ӣ!zRM^ʰ:-uƿv" ?DT:D`^UX{IQ;M֋'"oBR>Dj쁌h=L_:_CHھTu~@.!QpJ9q֬Šx6 G6f+\d=g,DOMNgA>_#7d 'ŚUBKSѽnPM B׿ՋfFw('u)l[S+?hPRpM9`VET7~ R ) 8`АSK%dGcQHU:W[=SD|~S(: ^XO)!wp=Dvv+Qgb^*9>}.eFf"F1`T;G}!+y COA 3,V)aQF__k`#^J !#2wՙ9- -2^f-yw qRfoEq%ۉfs[ g/-8,: "uE8rYLקR\y%AvRe.X \ǒi-|Fs^Q(tgI#牞7[rNQ2RI 4>`]6Xsl%Ӓ %Kv\ l>pݕkrpM "{[dKx֙ QΓ к Ώ]Ix&lԅP^_Yu~~/~xEw ])lRn99O djd,ܾ,,;ew[ɲNj2OҮnIb ;ml+(e5i0dK&ݳ;m&}@>c87!ш섖GCJwo @B̟__Q9!I&*zj GT!\F9.'pKXrpq%o~_*ur%7jvE`OOFrLGS0(m+1L_07?z˸U=Se50P#oW\FM?4PyS)뵽]Q<;>o֪4B (?ߍ ck8Eh2[ҔMaLj?mGգTͿv0X5Oy t5L+B 7u%v#I(uN dS5_ Dpvu NDƃ ,km]X DoYG"KcYZqL;t{$Tf4 6'tAu[*{x4;k:& #f_#9d3$f !sA_*s q8 cv1EQON/ɄίoM"dmhݿI9c4.NPJUξ@(_16h}B368TLxnŐuk9?D:Ԃ[JT.%"VZ7j*qojzk"D<;|ܓMb$S%eU@Pe~ !iJ7N$cR:2Yv*9{Fef(&eiSHtyxLױ(!o *AwI)pǐ+Y򁮟yx)ٿG=G2HbBq1"Qb(C78-Rg^KV?Dc @.f0@[:Z|Hާ%dTÏlj@*:A;x~4aY&+({r5'BыݤgR[k0OQ[Rw&%=>>sj^ og o^ܞf'zCy?\\^g̤{FzlLn{zIYEzvF~|kTϳ+Cpmk ,h߷G.?Lz޷9>B:So}0ۓu t937H.N cNU)sa'H"ፘ0вwL<9xӿgR|11߀QV2kr?TDTskdNJc&g꫃Bp[ȍ;^H!j'?qL\ vJBUlN!zʩɺ:tH,+L<(pFC{u: ;D_tY/F ;+0Fίގ~*ekdG3=gRŞJsO0>MoAѩs360$븙;ELNv4 T,I?g4A8$My6]I9NUc\v{nv~wr6B6|= g0r{/1׽`xi8qX}uKWT_)0_9xV\$k湗T$ܭZ ebrƎ]%d l k"C 12JAԯ{2p{n0`YrS708Sn0X/!u)6zC?/ZiLb,"bg# o/KߕJ`d^XLiJcc-I9RAptP^=wugHJdY3^v R̰'ɼ[*Xb WW1ao5Ɵw#;v}V*2嶭cVUM)8HϚ1]f]8LHxoɲBZ3jnbDeXGoG>avc6Cs='+rsF;vps&h}|OF%[g|7PUTTc[T7ks.VZ)v_ !tf;`;TPۚ]k#hROUIэL5Tӣ[b e1Yעh& (hcPźT٧\fJ-nVyeN/D=X$7_MTSK{z3E"8\ o}`0A~D)x:7?^@q5kxS:Ņ`ZbTyUڠk; ֌o@OHMQedSIzRj+S&HսK-6/BUK`~>miIÈf.`*bd]@WϥK 6QGe!bB_&55h:Y=tak7kg5d}{z|6 (SiVo)rvq:lԫE{P;#Gd3]OZ0lVlh*J;*wN"k2TUj!:'c)Ry>s3 c4|n :sBmWz͈Q/Ա6T6Z_IA +*]l٩ZA<]Aa -MZ_B%cL# bR!'LEo:)6RVt9ϓ$LvcQZy 8 dt+K\ s>ɍ෋|/V>~~\z/mu'Vwϟ=sw>աDgUywhǭ._?DYbVgmIht%FgGwS=/vd0/_z-~dkvœxS0AQŒQ x T0LR-Pkrili@la586Z^L;3bW.s'pKp)Hd) v__ggQʫcӛ >>+2Zo d ?TLɉYzhJUr VGnz,_Ъ (Gn1r,")pϓ&kQv^B ̛^ؘ{.b:&1*d3%ױζ":]Y=C H#:^Q]db ĺIs+nf=Hq߈ Rr%J<6BTWR\K8*VqKB=(K 
4Y3n#.朷f'=ӠEjCDh_}*GJM1ل! D}lB9FF:X/S!1Yj>t {ۜ ;#DBoB} h1ߏ%BVnq2.H'ajB!Ft^[Njp]8Bŏ=~iq4il|bK9DI DhJmiD F4P|@'+TX*Iչn=L^9NPs>4H;ۺ;[Y̋yEq=W>68{5"aD^ɘ; dA x43%վy۸obo Y\p{yLs3dR x|Y~ }IgQ$EtEYU$=>-f -r $hg qsdɝ͙] L~)aB1q{JLtc;T>.d!iYsf˷gtAdnWqzK)$ [0w,QHC*b<dLͤuTr\nsHc)}ݔ,q v]T[ A1ƙQ@P8nE"U |n-RN`fXFn%_c8KpWAaA0,(N{v/ "(xLc">NcE}$S#Y.aM x.:%1w;Rv~[^MB ;XӇ/k f !gjSo|&-XDb*"vJ>_/)Y :TDbSwVy,XF\l,r1mrL-EG繖HGs5RftN:Sql@rh*7G3D@&J>5T3@VV !߲rTjm(54y|/ @Oω+<}vyGƩq Rm-]wB SA1zpȅjS2΋5І=h#!{8$BaOZy84_ۺgjt_-OXcBns6a<ǵ 0Ѻ80dg\= :ls4zh~8Vu|\H=ZrEbF4c H&ƜqwtuqO)H=bQF9+Q1:ZޕǍ䨛/U3oUv3&I~ʢ wc]x8Elf^ͧcN>C9,Kxp-nysr-)VFǑ ϟN'~f瓥Up翠swd?u:!}A/%D?2Qt*lB&ҼC/@,c]hgR=@@1/ݧ'?n|XM4iT~śS4*T-ͺV?Qpn/~. =d%tl}{oM_k_'dWh^Nt2gŲ\X_5SX|&K=ԗ_ \@؍dewrCeu  $H#VNd⛼FI# 5;DTvuN zqioeI}m? Y_eĸT_ D)B9HT.|Q䠲/-׬60X1B+o:Pհ28v]jtjPBP.S-E@ZCF((˲8"+ uV0;k^uP Jv#(ڦf&;(bp{IC!3 7V9~%m娟ª;r?-THs2_p(oR)tŪ)"m`4|@:Ⱦ"m 寙 9-ЄEF:M`o!欀CaazpbnѬ: ]UD 5ߩ0jq~ Zg+9ȕ UXӒ}w_kIҴNH*ual#i`2w0:bXE;,VrBg"w*Bfe(2aEQ٫'? tX)ϮTuKRI|W-˛-ҨC%cK]y%tw%bΧ bYo"GC%"j7,^?P1֘_;;n<!o#O8"(x }6*¶ ɇJ ;P c(Jwبqճnxǡg̞W;oQHSC %RPc+*>VQNAÇ(ڐw nĻ\d؉`lk3k?YaaS݌y`CceK-mB6ﶟ)ٚ pl<|^oml!.l(Rpx?{ry|{r?A +~لo}w-MgZqƏdǨ|:r(#8Pa(# XT25pc"v# Q<0[Ct2HfՌ;%ہ*87nCl;@!wR+>xgci+P9=4zMk0tySsw H1o5hQ _kl -!\:ͳ\ 0̐"ʅ"X^c9?[{JNQ(PKW&x,Mۚ6p!'Aj+-/@x pK5@ʝK FSs$/^][ {UOwu `OP%!Ȝ<$K#[ &Q. Ʋq5H*]%#ƹؿ^l+1☄-sjx~%c}8dq0bې}|!ADfdq/b> y>vnaZkj1Eg{ۆ-Y2?޻hd)^lφ|PPC;ji-:_TJ0mdA "(D>hZKJ9?Up3MBE(_+d/)_l6QM]s,[yhA4zbY>=g:WX$!$^ F@!)A3Vx' ?)?ϋ2O1{t}ǥ]N^O? 
ېM,9Y^Kgˋw~k #tDrIJZ.x[vn(;BdZm#[Hw ӞC.n6I6A)>MzUF=CJ%Ke4eE>F3ߙfΜZ#fDY/Ɏx9%fM3 .6.Of+=Ȫ0!;Ar.uE}rm:^<lYG(>WL(b֖kIOan-anpμ-Z̎g5z$Nz@g[!U@z3{9s;7\TjO[fXa邋+˳g sQ~ jWҞu%a9ROx T tR Dvm\޴W$bM{ΰ۴ p:n<6@Vdd ;R ΑV r̃gDR;BMN],%׾-#hC픰Ѐ2 CWȭlx"Ŷ5ĽZy޸M2\P,&+ۍNo1]ݦVߌTlvX]ª bޢcs;ꚫ6~ۈ*lD-UDKTa+\vOV`-+]bG>(>/0)(093YJVmE>rU ܏Өz7ltrsQ-֙ wkWx%9 %zW9Le=}w&j`@\\lMT5و{\z47)dpXVgɎ{ r r($,(_Z0| vUE%ޞmZPAy\p}P$h70RLICщVO.$*,>V3[_a*Jn?QPeu9D͡6jnyvM0jq6>jRnp:RQ1pQb?n s,4h#v#eQ L:]hSM>V:IxJ_)W7,R*8p<999sq۴(vFjEIG=&Ԛ')-(AtAo/FUeϋQBƽ-*gᅌ8Z*8U tqf'W%& Ӑ4(on`6O4$%PuY@P$۵C)}&PAP%AzWU7Uk4r$ӹ,N )aCMW"Mw3-@AgI{@#NP:=at7|3w["^ wÜF9

OM ^ie1[;"Ú|`W*XAIٙ&HkV2O)L|h8r5ԻF(˵2?^dshb"ozCǹu^ Cns2(o욠i>2Zz]_|*_s! 2ԲU@b61!ߊ+RyA3ĸ&ja`{qDH°l0 0BMͺW }A6nNŬ_f̏Y`A*qN9sb.-߯^k+V Uv"⹕ĕ҈hUse ϶| qԓOHwrFL׻g)=FƳO۔2.7Q5 r><%H\zˊp)A?LG,:6qpa.jE]0 )fs*@UhNJ Wv>h;0G\bVLk; <QUаIZUvb i*Jlj-v b(֒ـ ºh"-ZLn͏%/҈jj%LolIFe-M9[̢>Zl$΅( E]"u$=Y&*`%:!E2.R˨'B\c ?1]C\ܪ-\W ־\r=\%!+rPB֕ *2@ uZ5]~B*PƁ*Hܶ,p_mЊqZnvmBz NS/}IO#*sh z%%eJo=5-{aM"M"ոc Ҹci*KG!e_nт+ڐeJ]Pl}1 =#5?$\yF9# /SuPo!a$JQ3λh 58;<d_vԛp&ɏ~5!LQyURз{I"IQg9EH<:{w 쳟ľw3n.YŖF!<]7#d@<CIfsq8>"6%ak 1&ʣ6Eg' M~q6癰xM4sRjuEg a*ʋ{N쏧O;E+!ЄTs$N29JR]UȒB/"KzpMP kTe|yL}d4!֪q ӵQ`v@U+Qi'B"sJ1SՁc1ϡk3{\2]0g)aEZV :?걺~qPH'80IgWHkXrASxq{Wl:Ԓ^EmQrcA~ V! bR9YY`\ƚB+ keæ`/o,v Y)>"b*}Gi[W0Viq|9ЅRHz5I[6~%I&JE\HZɡGޯ(^Y  jlk~ ~y Gîvuu,ۯ'4~rL)}4^爵\T:G=aT=2b4TbT,oMXEÌ7a$Zr ťeTq,-ƞ-p@QfiJu.LKH[:Ѓ@*+_"~(^9԰oiFl;l!%<8HUu*8lVm& ?2ٓD$PkfJTo.ptŸ0aM +>sUH'8 poF7hj(gkEM8a9DXCn2 B9iMrZ>BF8|?cۏQ)7YP,i۳'LnRƜY6-s_ F*t l͓MkJS; ?KȉBQG1b]3[;{uG.VyT]zbBY7yq噦DXzo*J/M^㋓5&;E=p}H&J½;ݘP'M?i,?&M^D`ڐ_} zaXcqġM/}[slcR! 8v@<`}\Xy{rA`I|XH{:o f3 A,v^_ަ%Ⱥ{*hqر܎;'}={p "ߚ_gӳi|j{s>j57CmO]֨7?s~ww o~߾|FioϿLLoVގ8>7]M4Dx5?|ԁ }r|w?Ѹ{a<7)<pt$}%w8uGnEvn{!y>zI'$F XF-w IhDa%W=;@楤nͫlLLG w{N[򬾥S n݋4 rI_R#2 ?G;Կ+$Y5讔hjW%bG൶flT"C]aEaGvИ mؽNoJqb; &=`<|&de~Q_OA(S$Z~. |ew?cӒy dDYƾ0EMmDc|*JRO"Ѻ/VܾG$NEa5|uחaMX 7##M}d27#PptMu2<o<^z뻢@ 2Կ*^ۏd&TA72G ᗣ1 g_O?%YNCcͮ`l4W!F5z˔b>pVuiВߚ(LFjGsȍ٠m~9K~Ȍtd}'dј^[3ʁܙ0A<;ޅe ?^8HkLݻJA]}03k6<$tN٨щo`N|Fp7g|jNS |b( W \Z3A.sdUds9AC 0@Kk<7EfTOJGM5ibd %? ?I0f8w|}8@Q_Ws]IЧ|Df}T$7_yjZ;ܱp<- ѱvo q',m˜ +`963l*A oQI/5-h` _yj)L'깹`hK}P@N aoj ^NbkV,(-{OW_?O{ tx( P!W,䁆=7^;  sB2x /ʅGTk?7BO.~Ahr2?S'3bf$̥|%F?D~Ӎ'?܃ K/]Xd-h)E(_+[JRRN)8sa:O)6U06hڜ`9e LPJ)VU4xx).SB)KDK)ZJQ2E,][7<^(h%:.2N =pǁ|"%[a0O>PRX5ݹtoMttg*J_3!׷޵q$B圜aVA6 +eCsbV)4',ÞꯪM tdqthҋ\꾮CëI: džw 'NBjPx mFu Z4`ӏ.{H7cY4w#?o&eY71]ZW_swl\nv)W1I$p>{W@9vjBxߐ2%]#EɨC.I9m[9՞!&E6I~YЃg_0Yze=Ef߷ L 5c:pZĝG}M^yi;/mΒjp\1@G<1slDG֢;@Vc:ǎ]N?Ϣ#|kX~10Cτ晰OTɀڇ]oK+{|$ʈ@5^OَAޖCˬw)[^;Net! 
aqx&_-* KTZb/泯//7|8owϳHm ~v6FOFS]jZ~|t--Izn᧨$52*:[ƒ]֢(=vQ nq꾕FweJtwIOvxȡݡrXx?G'bJS0pw]l]XGzTJY}_/c6oƈvMg0~΢1RF\8Wr<l);h$P 㲄LZBfrH.:@ez/۵QLrDdvEԸ*ͫx4墍+yϳ_=;]lFM]lFb?ՁO-~1ǟ7?<_#P3nQ`y H6˺Qنt,6)f`]&!b)5G9f9S-Z: X;9:r4444Yy}KF!)ہm]uZhV}0IŒ88 ftrÙ 0c \CWvqMV9]L =㖊2V&98vr2 lFA''qHP57Z::JUƛ:)sD+x^ #fʱ&1F pӠ5LFir4{s&fp[͠P$.sº} 5l<&@\]r~8‘6y.x(-3콤/tԑQ=SsEV\>_̞-= ܋W޴?8|p.}7o]|uxkFtNA֭<>-~H׭pP.{ʠ ]d,1y)0Vy@AQr:es B %v{` ܂/fϮn8-Ry~h}J6{1 6!F!*@3HCf+_$4J;@/"CGN.B%6b"%xDF66 9O%gtN(nVg撚>'w ȃy#&8^HJF*3dF cC5`:H^g*ZbtfHuDfV2DFanQ^as;8b3Ţ;z1|b,}\,cy}%d8&= ^} l+dMu d.t}miuOܣxq6+W7K|8;cl={ORH3R(o_/./<CaǶ#li@mݿ}jǺ۳@uGn*URI`Y׽Qu]zD8@%:btٳ9"A}X- lJbӳ,v@eNnf m:l890A'(;V8Ǟ/k) )]h ݄wOh (aG=<̅b?3f-f=Û)fmʹ0-u9faB-]*͹f=_̸2nbX/Fjx[~_fS~.@8C{!1ݦʴ &mK_y! TuRbQLU{vB'BD;5)=z7dUϵˁ9pX5}];1VRu;&eU+mےUeJ~{wVelU;G]gHVޮFFlys8LV74eL2kj#{ZDE 1Kˠ)h`@zky@8Yj>G_?I\ Yu΃O|~u,\ےeqr5wqyqǮ')/r0nV;Ca7S+*!#NLGsFGnPCBth؛z  hVfb$8c%0潂nWt_ϼ y,Kmi i'4SЀ$4sG*[f]\N X eibiBJ#(juҴ6,ݎF$+ii<@y, "iNp9,ޓ`R Y8i2-9!h0&>ِ)S>vd'e&ض]wl^ wj4G[fR5pפ[qD/:@`h #2dPIH]u$ f[8%FgP4}?41 XYcȦ|`b#U5J9z ;848vۑibAc=d_DM]k̙y8KTddb@*{UdHE ǠBXy #dFfQV ή3jw+*nʦd,Ib`]#bLVW8j~Frz|~s/ٗg,r قu=QP=@:J6T(Ll>Bt &3h(9> :kcW)y?EJC3L2[`ҸXgtM\1˛j>m7Ǭ9hrU ޯ.'?]e&WFDW[/Ǜ7M=QKf(M(Y$YY1ՆoUP[QI;U2 _aHaHpaɏ UT?gO7Rf`?]U#rl+_\77o?=M?|k7tzj;N^Q(էSԕE v䙁B{0.\\Do!^W= )]N|{gvhHGg_6@+g5X[`UpwS %ɇG?{a2?NϕGWHTF?J߷j7{q zFכx!mđD| O"7=Ëİ%j8]]U]]U]U+%&Oa7z)vu|?Y-O׳fe~8޺e996 ڷ䧀*B 2O2[7?r3!hZ*!öGciRaiSϋ.88Ѭ[—o(`87˜R -eY?~合Te3(jΗQF狌bt=MI4괨Jj%wR"ު~zdNN.%e!pDJ{&C0ƺ|I굍-zs&BPqěnt,9bP"`8`dЄE)nj R[J>TUm>U1bJ#?P)pwS2Gng)ӹ qօD>"T+o>d%)j{vJu -*;FdݴzYZ +Kr G$:XUPlEEDbuOLh)oXoA#@IKUˇI.IIE) wX4C$z~w$~/I]$զdu^w x0Y|,'}. 
^"r0wLoWB݀Th[&7Lox%UFG}f6ơ-oK?5 ]f,tB7~3(G޳@XIԔPy!!AXC3K0`ˬ >ȘwϧއpP'Gr}!z 1XGX `˸7+V>{XQОnH' BiqvbW]GvJ_%)ճx끁ҼNdB{$!t8[Oor޸xzM0ƸVp`Bz ОeFz۞Q^LrS.hm廟JTT=`\`ZI!W1);Jh5T5q8&ၦ;㾂:.7b;ϗwr`\&_''a0}%ICSci<Wg߆1&eV|v}s{,bQRdR W9Lf8zm@iVѯAnJma'K򎚅L{,d՚s(2+*#il<;:{ҹ{NK[ &D %a|%`'SC lIy훧FpՐ# Nig4*K+M_LşZs$LON%)Ji F GWoP ܻKP&`{^웳/D6,Lbi3ؗc %\ي;|뽘ɾѻ]Oڿ]t&XlY23|ó*@(C1_z V`ޯ`;dOMLzqjT&*῞aMd>>Ϭ7۾Gxr%yH<ϦJc053{K ($LSXZ):6%^Ǧؔx]oJW1Ef-9lB! ,5AqڕSɴ4ؖ[5reU]o+ho+n4O߹a~9SM#+`e ÊxMaI-4 c14{TH|`6rWDNs 9wvó *r[H3b1 k͹s*`!xu`=GqQs/ {D3~%YP7;*%9Y9^A\EA(# R0Z2`Ře8IgSkeE*u5䶑.Lbۘ1 ~ 3^6TojrcQaS x;Kg8/jgkD2^m_4),{g31CF/8ƶ/^*Cn D8,3(`/8N gu$1cAK@d-c9U#ˡ\9.$NhP38J8Z#Z zC9:v nM1JRN WkuR+Zu3`S6Ʌ~Jئy)gv3GGtg/ITxzQOnP/R[,h@O1{fa8:>IL a^w0{jPV|f<̓XTɊAcL(/0>s4ޟL[)(Qvc,;qX}}KNsZ_М2ўrVv5QN[Ґ7Y:UCٛ?ӔjLɆs̪vyͪ:W`daj=\QhxM^U:6M5mhWo썌pqTckٝE#pȔÿvfɤLbú$tj]L-6>AAmgڲ|QAPBx,k^i)'#J+CҌiڲ3fFC9)k>.G3y\mc4,O)^Ż,#64䍫hN1u{uSuAdȺ*;;dւq)>Bi8D_*[NqdXab%kޞjLdhXTY@)01\aˡ0AFCL!bceF3Tvu!L`K9L L-;y?_0p7X^{'0xf4߅^_y3E?ΓAd~2hHv߬u e;׈Bӽ"npޡ \-P<^?l fz dmjc$8Rv-bKUN?Ú{紟]LN4p sr.2MDA|?U/Q#SA,uM&cA^/CO? 8p(,!.B HiTl⥅VHnR'&0 <ͣjg10}wLL<>Çϑ {طx3ۋѪ%Ց_LΙcN0 HFK#ۅfQkO"SVp 5җN N skKfB<ǜ 9ǵZRK0G F™Pzd) E^"[Puax#CIA=<#uR0`Wc `,Ƞ#X/,+t(aRQi= z Wy9D1)Io w՗rA1F'>~g;@l,JadP:UYj`M#1'P"K.2jJJjmđքyt{2+3HLW!p0AfHB^ (98 ;(c,&G/oZ+eݞE(jEWgḿcE _qM ZDrs;0!m;ןtl? rRiL߱]&ƍpgv[d+'t ]DkIH9 pQ"ǜSCbV0K!X{v[OHňUhv XKMu7+_2TSL&e 5ӻ0-:O鶤5nMy:}vQ7!Mw 4riD FX[uZb! 
"QZH7soOf42)kehV ݭ#XpXYv^MMP S"\P&8Vx /|-Him<щd\L/rI԰ٖ%#"rwizJ@oO?VApr ?"ɦ.FNN NgN5Y k?{Wȍ/{{v΢r8H/n2q,Mvs#[MKj[Q3QI>EŪ&36|{|d[>*ǃ k)ȑ ieQJibZ7|9´(;o`'1s/ M= >sУ[p_ƹCVE5 KbbJ~{+Kby9yXyё$ %xԮMᆈCm+PJF3њfB>WƩ ׯ?ݮ^SNY'.&tVAa.a3[3(e%JrƹYJ9Uّ\|BAf\HsV-tgoʸi(xu%%+OjԒYRK @um;I7(} $3RRAe*憬 !TZNxeYڜWD ݔ@qF8VzQ?&$Dm\rrPg3i)wgᅴŷM smz|u{6Lo6όm#ל7b;]nB/>)e[Y*sӗ8`WgKIߡG3ӱ^_lnU9#Skpvw](uX-6!̹oߙԙ2st@J?q~p6sw b@͢ZŽg|3]z;O+ldPh/|W;.@AYJg{[?ȗ%"`ć" MCNݎI.Tr]Ƙ2mxd˴8H!'b!jj'/W+ofW^]7ٶ.='[y_һ~t=+^!^=KLNX~*Q Z 28_׳wkWragmYؗnh,uJ]'Ƽͽis%C]q]m IxS*D2ssce&h}WAoؽO²sƾۆZ"&;t:'́JhtdTWcoHu!"1r9483^*KW `bi}]Oz$ Q(b:EyS!^o# ?9ZPWKNl-dd&I@RG62R!ănpa*>mhrRIsuߥ pcsIP)FalY*..Y-XV"׼RHc0:3@Ju%Һ3**^xO~l"B{6sr[$yaSezHi0h 9Ri]LqH +`)hB!#E%$|w԰*t:\8n7Rx"[ eTU᪤Eԃ)ZX17ƨǷkM'ALWiyJ+=91ho~@Kpe^¯+Wkcn-ك*Ƌ!,. '.]eOnɯ;K,] `|위Nrɂow"?;7?4^r3aߝIW;;N߿|6V_|Ӑ.lMFIB r_[5%dJ^W:m#V`jHbf5*d*,іK(;=g0jpJ!H);BR!5w\j"n8#os 45)7q]%Aeg/h1ijX&^3fwK^O7*G387G H(AY5ttoH˭B}Wi)gkWYs6Znw⊨ߓ߬? 7#>݃q>s{D:oUQF֔@lQ)]..@JeX[FvnVu; _LUaB  Vr7_+\PpC~a9hDdԟBSpBZLv 8V1_ZWVdU8 UNj2KBҾX&KLNBc9:oZ ιd.gE8C3^]{8o L6nv~ͦh#u3J6m\(* un>&jPo^F}˗_"d "nKU50xGVZ"F,*`js˺5ǰkk|B;!%rlBץH`Ki֥qMU*UrTR̘6&%m'%M?:i.|A-hT2&^ c~}4܊t*ev ?nЬь_Fz~Wgahp?onMۭi7mM n3XH݊mͰ wK )jAjgCIզnUޥ+BԥG? 
|2ݪ?r\bvzf֜tlnavWg7~RN轎 3;~*[7ͷNeq,(FyQZJ#he+V</5ĸ?ܼdu=sʚʹo95RR"pp*)b,{\6x1"N #C3JAc-B%Vj"NIkZI>t{ZVVa,=FF΍W31:̚HT+[+2wsR0Ϙ:(t R`J)7;Nԩ/jKjːUY9q>YuT\#,2T/ͭ#d : w=$8Vd= !L( » [\m,ŵ5=]ROe>/x'jj4z) AOuRl,əaq1!@cL` '!Nሩq-KCS'0ASO.舘:A5Ywi,i8C䄬GﺉHH5 Ef)*][K˚?*\SP@Y i9ڌ;+E%(GAu&Ήl ?3G6: 4!GtIzP̢&l=8wrA]䒘o0A(  ӳ3 1EO!ЂF/t ~Eny~1~L!sO5`QNsݍΞu~tcyzu ];SQ?󅳳C\T\mEȰxaW\Љo}Af5,â0DZ@h(;Dd8wAםi((%r5  ͧ؂l챍DWڕ}O1RbtltRNVOeN)sS<) E{z)$;|)#r`(%M9)( P ЈT"ǹӦNL ;ӦN,NHz?Fɦx0~w8"aBE3tHΓO$\3*Tn+Uh eatIL4fPb< 6 'cxCʊ^K_i.ġ3ZZ:uQ> Hog>޳Hz#LCRGgOvƦ`:`섶o2^JD DkC ,6ל󢥍8Zz vv m͋7e5}-‡/ *0 bOGʙeu.IAA #$`O5<ŵ"ZH9a"||:|#;ά0B:\BK;cnY91<8E .xw|zpaFWyB*%;;tc-V yGoAW Wr؏]bTu&+>bG0b%$wf;YEŨZ"A`lRsUJJTWڲA:?,rƏJRȓ6Lsz\gj{8_%Í/A|Z*kWv=_5IICr(L GÊtTwWuWWY\7:< 8eB 5D[$8(=eFωs9[K*wg4Q/c:M*IE)[{e%S]Ů(Z*]t*& qjs 1@o *Z%1?Hz#6)Ag%!czjl|K.ft4?b)rvK0$N/{VZ{W9&ע2M )Asz[sbsJjh<;!h l649~BcS!9䡋tS"Wj!Gxa[aC謰hAe<-^2/rR0,IZtNE Jwu؆2'#R{JrȒ8GtSAF"͕pjWh%-V/z[Lc#ɬhOejK)RSҮ Ry IaZ7GE"j:H|3D0Вj7.֚֜cKuR3QwMӵq2lM;Wɶt-VJv,~k֮'c.8jt*V~NJ) Q7乫y7{`%^GqZ*X3H 8*AyݵV={wD&p\;=@$(B*Ff5,cHi]@3{G+ԁm#?΂x?[r?eݣ윱Y)䔨(Q,w[yQ|)ۚJ 0Or5' 2H 1(a8ap#I׏Iߏ椝kb6IUՑ%ɱba&fYdab1F)DXK Xf̻=n_o0u}xÏTTdlǐ'tYg3:\d:/?0^ZxQ@nef:=IL( /y1up l[ `j+PR-nf| C$ZGwu\\z8ZZu3SZ{rY{h UmxJ'4m~=:KvO@Nm3F'mS9&&6w{4UX_)ly ۠pr8w@_4xThPp/g]ݗO+2V>Jw! ZrİPsKsu8T2- yc$  o]jE5jl#w1ja)ʭhCI2,e rb9 O"ž`I]|f4 qF9. 
BxUL{:Y kUh_h.h'2=)ֵНTu⦼ƍ艛ZP$z {d6>bC,,< &u Z!mtSFSv9X:9/mE@ 9^: %o_rLVʘ"Єt8ns:Mǭ+:9 ;B@ fs%v00A"xZi"ny6Z𤵚c,s[.yWONfd#f^g3~zEYUKD->G58 pY Gw}Վ ECٺ׫X$h~\6^yL1 WkztHzvIe=+J[ O0홹 1>=ce;j~C-] x&z 7]DB){W9ɧuZBzI NsV8Z$*^ܻI5#  0l]ݘ[Zכǽgd{+Xvg" ş&`~0`Bk)^} ֹ_R qPw?X\_ݦm\i~ӉOf~m@V[rjItLc58]Xu5U?]-Sn:cYMʈu"&O3R92"M*D[Q:M+܊,5_9SYpsu xYeY]&UMoZlUiu,A`|<]}Lj1{f?o-Hrh9Ƌ#FA5yCCrLF`dzW# .,dP׀lRl<­?z#Ak3̍0sI{]զ~@?Gag&!)uhh?%De:UbZC, s%v9kJSSfJP6F46'E2h*S69Rg Y@J%ou\V۫jYu^رbw1D4ݗn>'ۂrU-ڥvoFj|iYmӗtV1i/rqs㷕 eÿjYfi S+qn*>E]|MJt$PUv|>_Q{|/A`;ynP#you,~rH 6^yX"{Uژ4rlz[9vK>KF{=V!(Kᴃ?=QsSqX#wqKĔVPQZi1h2bzx?>1@~lzA3M.wy+8"QGh qNr4$(hPDZ@gHLȱC F=z%8 |I10+K9A$;W BT1͹`B62XTXnbpVG7`z.uFbGsR;81Qs;* c>_c ErK3t{B;ph"# bc 6i1D*<뾸wES #K V-pրJѥfY^!p̻fbPmf1hr4zvDO{V Beh3 Mle /EW.Zb:Z b'9#tR\scLnEa /֣8ȎX?=:bA3Sp;q]/ٯ;j{?|0f %l Y J b;E('%152XEEb&0.e'(l,Fa5JÁ L; x 1E/{vasIm:!EUwӌ퉬[υGJsJ8t%6*~S"̌Mԛ>g˩ 4BZ\l)!&=mV"&Oyv3/CKL$|JG7 9K 9K 9K 9+d2&SmydZȰfT89#1A(A~ 0K ](m<_?g[?Ԣ㺼S)c~ʉ.p/XdB(.)\Z[!_{ P D݊9 ctLF-u^; "z9e(&4tTQ!N+D$!&KLSʔJC("a ~Ea^JŒBziɠ&Gs-VI ^) ^#'( :aVE6ѕʚ/x̙4$1o TZL 6"o>"$3ej.Heb$wUHf'/1Dg) K9Q4& jĉ`a 17S~/`ɣj:m'ݩ;%`K4\7a' @)ng*!' CO CHOVh٠N6sɂ,gOO1NՉxSi>3'pƄ/;^G u ` }kE"yFt[&\j1M:(pmAa/~cH [or?+vf1z2U(qTƯ4vt0=[DgI{8+>>x^[ FlāЧ5E$#V)iHP3ȈÞ꺺*tosPi{:1II=; tdÔQd/!ӈ'|u2k$қyY}1Z4p\%o.5mL%`e+\nꊰDTr 5 >z4 FH [2sg)huK=&\=B$[31kKYXS]u1S('cIB<ŖfFa9,|P,H_:2 XopJZ굵-! f]EK#*i% Ů*(^ s!W-!|kaV&Roy' ,S 7&|S1=M+p^S9I P).h &S|cIeE](ГDM(Lej =eGÏ_2qq1ORURRq;/niPɰ{%uJA5@n̘,3"Čkh1Z$g򽘥Pc%THZ*( VF99 eB!+ry}OɔP k̅Ng0D-Q [ko&0eTE)Trp*iRI̜9ThD9mr1bJmpgp>_}հ:?V Gcu Hp!d+WJkưX)&>4?N"h դ\(@A7u `6 ZGKBn8Cc9Mj(YSCcxv_BB:’Y3l,h(R*3-- fyN! 
8uG'bN@W,-Wk/5sck;Tbc(T"iGת8U>}VQkA<߅x2\$Xr&0^B`8M^ql/E ;sv`[,*)g_L ECƘ5N lWLxIXLr'5}Rmy ;vJmC8q(Ќ@2v1Y9.(wz81"T:\DUYMM=v+bMiCsw0'FCL>!KXht̜9Spl2u8"H{cı\ӗ SJzu'?jDR'RjBk/5Jij*ן' 7 4P.d5W>41\BJo%5d9<Zd*poҢsrQ k@hXڬӸ硜maU.CX"v7BM9 h!YWݱJ0qVO($=)kN/F<.Ey w3閕N#DWJ,쭬;#~pW^qD3m/3WcP PKq>.5M#E,ɂ\ИIx8bdHoc͙㊀Iʄ0ݝ[ݹ]"ڊ*h5m;T屣;W6wC]*i<)K5 [[ٝi$u6'Bw ib!R%U[HoFꡘQ|qNC}j|㴼ڦ.P5~g{KzЇֳ u|:zyCk,×QBG1nog{/b|3;̅7ԇ7:AxjmoTK<7vEN"#OKLgLrS4" \(mh ۿeux1vzt_5'Kop2_# *fp9S`ǯ^=?<_ӹ"?~388Iw;/_z/}{߄q}Mx9 yx*?G7ʇ 4$ o_1yZegr ^? ;`J㏻c\_i0?#&Pb_Lol Oz1znAVa|]7$׭.Nm8rǰ׽ry~C ? G L,p<\/ `^; $'GBL^m\L]B_7WCo.cp_]g]b0c}t8-5 6_:1P0$Qjp;(i]soO/ռ䟀Y??ѫ{帟Ww}>0# z7`PQ IGa,w:׳ AߠxEBWP{ep`}6YԔf5_h.:WIƤ|?Wߎh:0| nݻ_'@1"P0BcB&LeH\y#/.e^= eQ(ɸb!{plcna9NZ-g<,3Ȍ 3P.cS(ՊqͥV0>X7nMSÒRRDa'dqc{Go ~M ~W%k4F|t: nwԯ-g,+z,aJ/A=2!ETl@co}M[pi}6#w:i ~y Th6YY{!7pNRI9a7{R1S%q`Ҕ@FF荇"A07B~I˥6 70S;Qhb@ _ R#ߨ5: Imړ?1@;^TIڀhK FŰӄ_^>; 5qFQDIE<,H6Rpfȁ ȆapiЫ9 M&E33 >ց a18",=N"Yq`1g,,g┓[eQ (F`)B`!G !gd qt}U}7 %Oֳh.!L7GŲ|Rۨ?ɀ}ƈKa9,,c(b[ XI  G< ڲH)4` @bHo22"'QK0u8>IvR(sl)t1DȽ^{sxHf8 \561J1ZVɘf%cMD%Z 9$`"ƨ9Ga7JJbai:|uIRf(0:"EM,Yk,8hpe5: Rf  g3J qPmD~> }+l^ :pS2J `\Q 9AxI9 'V`ø=>[0)рك,#A" f$N`ANؾCMXuXcڌt*9B`#UfJ ѥKBqp ޵q#"AΞy,.AA`4xD8OQhzֈ-V$AL_UbUBF>WʸDcM}*9HШo6* cmIfo:U5nF489"B{TX!\Gc$% : B#QAQ%Ґn>|4oJ-cg 1FTz P˄4(]p# lpw3#)Pa ̾#Sv<M.^9!#^pCH z@v0@9S&0c, nBpZ8+0@w"ke<*I b ͋kC18'~bE'B."]`;!=o]֠",J,DX_5}_e/ue%f_(Ɵ|@Q"Eq@Α A[u9 Qb}!J4ֺ3%2#cQ7=U#(S93{?DuBIb]Ÿ|͈ Q䙾T"wDzN9@Hn=Ϛvꁢ &OeoU6mV{/|wwή%oOl&2wa-3qN5PAtl`z^b4nojA<`m1jR Orzk0.aaa*g“,Kذ0jYF dzsN_'ӶʃX=s&9ܢҬ\rѩ9~ωj2zPvGO*9ݹ3QMFUX w9ݹ9^MF ns>3"]"&I7{2N(;)wB^!x}gK@ OZVx~?kI3YE)1ixEb FڑءÊ6SN?'1FfK#3f! 6D$*>hl8;m0br0jx6D plgØ^&=;'/:kL F8'p)+;\cluHwKO\yѝdt8o:Mi$qS[ZD44]p o,*$2=VPL{/4Z@74/MpXޞ[eӼX*^~QHqw a{aZo-UX"0$tyQ' %>o ݒ<]LfGxCIQ^ĝo%$^ Ƅ9RɈYq.H*aDH)y (r*ER_6$>yu)S)()x-nW >JoDې%-O(1[;)8yA, iJ\ayJ|0Z\s!ZR^[rq*D 2M4j_UMK%,#1;qvg^0$6 8_c\#1/,Dp !N%EQ(]PԮ'.iq놊pm=qe8j LVk RS-HaC˂jFM'7wmHvH[Ɓ_P&n?+҇G-b S8B99F Zk*PjU. 
4֮i4捂 Z,̟ܗ.ӧ7\;fg:(w`0ILN(`pLB$(\(-0FL,E\/l"jUND3C=Ig%|z1b݇k`~ n%@ @,393zRܯ3a+ o_FL#ߐ7T;_x׫߻{yvGhO__]L|ܰeoh\=zroIq-1GbgFOoդcnOFm36}o~}s1Y-]xhUԱ)L/o~z^i΃5vb6G y 4#a\ UyX)RcP9H W8D#]ʅßxJӂ*Kya iIG'aW/Fxɋ7i *\ CQ &;Θ\0,3,\,Na/D;QM$c(`8`녓^"%#[p@-METhD,c(,IL.!1[YnBR>H, w6Ŀ*r L1_ T` i-8ΔRHB%i`/ 0!pt8@a%]AX4EyW{ETO%v"{ENQ!*q9M PZ^*JB#B)ٛOO˖T+bVMa q{ZHISLUl+cnF1[?2wf-KPfB8ڻ7 v% ۮ_g2tt`-F̗7 vVr0m.%opn;:8+`C;4]K`M(X,R,i){,o,vֵW)\;%r;+Ayu@*Cia;Uvŵj_DF}aܨWaKiuA*5jt.&SU `hԶ{dy51.XKc =2lO+VJ}H^P<@inUCq@V5Ø96.i60ic`./=I_[~=SIz2p.TnPM^ >\߯_g<}n7[^{g^^֟N)K'ʷ~f.|w$ r oW˫mˏ[l su)) 4q,#(RxʸQvև7*z ;F~^<;r9El٭*C+WDq7F$IE?{gY|0$qwlY0;&5pˆjxD?^( JtySœr*P6qv9Qg%E4$mgg4B;gErb[.5~>{YΙjڤa6!f-jArjphq\8['pcJ@+?cl31e5LI1q]-_%: q yZP$5@SVE/pIJBTVph\+E(cqx& {ar+ \XZic(Sw-4g 3|z*Q- %Gݵ\rʱPK0|IKʀ!Ŝ@PqKI9JNN&tS9=tL`G1וEOZ]>/㍫S#;q1G̊fV䚶TO=ٹ[ơd!coʧ, 5 E-aKUAtKMK8~R1jplK5MTQ ems4mCeYAs\\#, L c=i[` "q(lV|_NVBl IC|uN՛r-( 9 NR YI;a]дn]cv )t ;JWPo1- {AiتI5<Tq?Z//,@‹ˋC /OlVs,۶Rg%KkOq{WSYo"k7xP-x0"SJ^ 6RդTPS.@ATsk!j&m8s{\{ҧHuzj`wibex_q:pp> D?O~M~56Eڤ-{m8쥾#hZJj]H*I4rN[|CSM2$b0i+^1& -P$9̢=O']`JtF0p ^Pf3$g{*$ Eu4bN1W&c8'^F2p\pt|BW|Fngw|c w }ew5cL;ZNh*rjB T!˾,v q @e98HJ !,j/%c&F@q=wpXg!yfNB4XrReQPϻn`8 W1|],|1NN>:#pe9 uK~sݤ|*h8ۡ"-LZGi5׷k';KV\vBiMi{Dh^o g˯5R9tc0m#Y^BEU~HYN*uWg_Ra0'~zH/ %\E{z{{;~S'>d7 l7+$̑._!cFqNAgƑv謎n\Q#fl") ʏ{%A"u1 afK%ӧ 8Qf D3U'oKJkI:&SyġO*_Rt5wֿڤx>URzTtuk=ux[`]90]Bq>Ga:rLZY0LbImnF")1-bs [YS/$}pUP[v- sDF~FIS]KsD&4 M\}Y*)C;Im>Nk'>j&tݽ MdVEՑa =ix moނݽ{h)D;\oD!f#\{>6 ;m~ L*p3>I }ˆG5sk5ua '=MF3^HbIbCcOfZؗϘ*+'q|ۣS|IHsa'\Gƅ`W:m~8rۊswd[πƊ0h-hgnGyA1P% `KQw{|5_.1*/V#_޽77w9cc+ (=n3o2;Re"^D6է rbeMvQݻ'0C1(>2GV^B J/a_'QϽY}$n-@޿ȣVm:x17f.2|Nǁ6/ f}>fncbdrioԕ6uM]nSWnS7~PTj$*%+QJYd0Ot S͸0Y[~>g_.ocX9<]`v] Od aFvft6`<~ m/wċo1UUE1*O=-eE1A]Į|+Ew*v+.+rbkD&9|_(ƙak(Ƀ@!ߣM)#*ve;+ p R׸KL×SG='I|+ߟ؟TҔ*r r Q:%&LaXƘLg:xb?Xa$Mȑ)kg;w߿/`W`}wyv䏞̯x#WϏ7ř#B VÚSCu,"H1*7*d%,5Bhؐ9+8E 0Vn?X$ G7K`8O MξiPԢ,CaSUDcCÜ S%Q3V*TI`:*3 .}8}sgk؀N/mU aQzA)ARIc VU4Bia8GD29 nF\j# :1I, A\[{‚0JKyF c`v?ZcRf&.bcmNRR]iF6U 
lڋ1JzѧDFMqRXάU*ǥMH.9gSrĘ*EQQi}ƼVcy-yB(&Ju8-Cr`c#d6f\i$5L"$kK-Ca}1JKf 9" Z* jD`FMl -%ÅhV@h4AK<R[J0y9ذKYG4qhiT -qפ,;fnZ n !\㖺=t70øv 5 r*(XN03k8v`eRg66O\l(VNv0ZM!Şġ&ЪF᭻ZC0ܬƢfatp=N37a{nM&@AݥO%ElN76bĚKūW?!B `sk32&&$Gt jN2ٻ^ 'aWC+t}7^E<~tA}XT^Ƚo&3f.|Yon@6`{xth4Q9;~zsx2]xF#B?ܤ7 2;w ~|yO6ڀ  1R윂[*ŕ~-/UHj}ݥPyӀۤ{r`AaʷKM R-5dW %efK <$&3 G爪b#9*$8xr3r۪e?)f!dwjĺ1ȣAY Ea])bS'azh_eE|]$ȓ&At}M"@ZzIX@Dӽ _+Z*ԍ#wٜ'|jӰ:|74p4dg̿< pys,Jy]]R3G9Jt0g[3ɦG_6\qjpz[,<G-Vݚch5"Q"ZR-~:I>+h-uN-;?w̨}f]: 4l5:}.N]8F 1`(AjvT!a®Ő ؆꽗nkF<0Fay0Ob3cG~BN7"?!_*D~ۃP݈wVٽPpe. \unj1{'AiU*QՈf<>hsnԬ`iE;r9\̟^8a "#904"9w{<;Zțq.,1a&S̫f9FaLvysAcrAdLbث9LaugUT"CIPgF!G=0^c"P'0qɎ{mLA|'}D Ag+ O\.}AҌ1#\jzʚ2cZ0WSXŲYnE Na%+">a"dCr1IP0r@!Ŝ@c92$#NeIs&r"+܇9ysNbT׽db֒l’٧톀u?hEV:7P)T1L*1R)'zL *uXpw㶗2$AGrJښCL3dVEs0fa3#),\4~262AֺH{U/zJڿx.r>, 4g|fy'foz7^C7Ƃ/d/g}gAs.흛>x2yZHHwIٕESΛEsɗ=үݴʹVp)EuvEb11h^Mk艆j:$䅋hLB~5-oT3M :νhU䌠22q󇟏F)4B*1D*^==D@"9a\ 8ٺ^9qߍ?}Ń;WyN?>{)W_yZ}1y}u͓+ 57O^1oQ?~3s$K-,IQ*ƜL㾵9T+3Yk+4wPPՌON`La؅[CsmVm44PT&eq$~ƚ"=HFٌ^dQ2?J}){8;pDж`zKsX D+]΁cZA|q^-FRHkJȠV +AZz*Q\&Ι$D)Q*|sxd嵏Y!,g=YU [J+,,azS;4BVI )A`i :2H,ZEcQ;41ܸz!|{ƅsFzZW|7|P\|ff/.KD.+y>*B OdU\.5CFxYN@kQ d8mJ998p„. skG ƣMf!QJnpi0L{mn)t5<*a bҬs)};1FQ`(=l k/n5(8f\c((_6"OXīP@BŴѧLpIѳ֙ $^scdR$ `/!GD-<|\a\hÛ?\ݦz0Z`7SPcuOR1o,rLfs|խebyὃU5 GU9MdLEQ)=#- ZXR6h8-pڠ{ L}ă_mqD- %@K0lǖ{!x}S8t`&\lXhn7fm,n `ۿ)s??3u^vgWШn(4EY~dŤħ$ }xwA>eehD8YoٍW]5XHXɌ7Sx`*xӼ~}uru'J3\O#|a@Yaa7B<5%c,-.-{8!)i<1<6A/0dkIiqB!clZ~%߸s7]z9-9 `g!hܘ!7X2* wS{;UK&!UTgeþ\{;e܌-彃Z ZRDGU w.xa \}| u3,I9z i a޵m,"s{LiNNE:ȣAƒtt#K(9I|;K2%A$'*D gg<eW⍨t:;ɺt|NMR`aXD ѝ){۷n-$  f2’c,&!q`t8fBiL*r da :[! 
Ⱦ!`=~/JHn?pO#W$_uߞuOoןLڏzzNH]At4n&z]{7ˋ~x4qi;$:v/'az=o2 UIzi|y(޸iH=iepn›ݚ؅Ɲk7rv4 ܎kx)LOֶ[*Sr]ưa%%fJ{X?t;~J${]/m2_jQV_ON:^TuL7Rc!:.=r9e^u>7h;jFu7ohDنա^7|KHuTuip쇹 _ ;Q&#lϺɧ^Ť8y H//tl~O~VN.ێzw`<L9(=cRswⵯ/ǀYlz>_Vl鶙ntm/2Ѹ_E'n{( *yt 9/%~h*_TH逋 \i8cZ1`h7E*9m0ѝ24X x(#any)g<8hsJ>tCv2_ Iʚ)oORC,p9uX"y(JQ؅&& ƍtD`:)HĹ!ӱ:D,cJH 2Cj[6lMW#cјnLm߲cY[/ǣ~:?}ylA^;} >_xF@nYǔ!5hyGE":1on^PE)ZWAHB!***gj<;UF3R,Z1&Nr-|2G*Ij#fp{=2ndLܧLP+sSGW:2q$ȗ^Omn6{cѕ-z߃ʸ|c/`ǭaVk1F߾}wӰ#9e4_.`n"S6A{ J lǒqedwH2=5)ָAH}'僯K 9aJ{ՑDZ*(X!8@|W5OKk,j@)j@)xY EDT9gHTtΙZJmQ.k)';F6RNj#̰;q ޞ~BR5*I-;N5MJl gVA'oAPgQ$~Ba5fq nnu@e6NE|緦W WUL/nڬNTN4s&gnr3W`3oX캼R %8hisMTzrʨ؝dYqMrqsӧT㲎;NB]*،D5U+8dJJR__mB<`Uk嬪+-Zi/`GubcXfbq0`_k^u >Ee*ؾ5ۧPǢlc+IܕKf;枿JݛPG.^]<ͱK<ǒ1%?4C#wήOvn̉p`&kc.F}^Sk݈ qaPKi啟!†&a,`RAb2ulWh4΂/L 3K7c4+aUT| .<P|Y$7t<3J մ>]~_=XfhUQ'ˢa(k8BT5II`j 'l,j1Ap,i6qnUe9b5i %GZhu >jBMZu1G~45W;R^3z;N(ʚqXĚ7({: U>ZeZPp%֣_!H꒤6b>H}=G[ K Ȁ0 c, NI_f]E"PDP2Ud bq0il)w[" a,nR(ȕC\PH a@2y$$ !Α\"#8+-4&!C"2)`f dBlgFR9fԅĂH QL,nop |9SXVIIpHt ZE""/GLbL%"daDjB0X+D&Vauh(&D΄\1 1lrT1a`VEt0C O[[v{9lB?(r (@].&CDD3ˑr jq0PpasZp bE(lK q &#&0a}K&.Uy] VVDb+=M7_ϗ`Ň~*]dM@޺e' Q^vG cJ36?bI?zսi LK37Z_=9>JN5~ MSwmo=VZZ.68p^"E4њ4f\ߚvDSFfB㞋r;ZO|d '~o?4E`VrzypGId4U$;B^BZ$R?W'v+,T>I6H~: X"+WOL',|'Ӕ332! ˿=+Š%XX\M}쮇ρq&[`ء{?m0kFo%]C0+ݒYxC&T{oBR27|Iky!&4Rѣ:Gw 0';:!rs7 .G^ԏl)klՅKGt%B:Űsvd}Yp6JN5wN!ucTuO8Kmݘu7eUcwdu 87f(䧭0x%)R <{Ck~=Fp֘nx!qUNb戯@;edNSkuX#B %mjPσ R EhȄh)fAqM4!aՀg\<^2re I6-xo 6y6> 6Y!K }ȁSƊyP<xdMiAqHC,Ŗ:^7ֿx0,ҧfQq߀/4c'r=;$:XhYc9ݹ S^j #ʙ^7AKbq-)˔PK1\Ǵ! 
ø$pD2Ga"ơEH"a]"x30dDpvl] V=>(I*Kz/7e$izHA|sƧ1 [}ًݼa ~Iz-tn̨놝XNn'I{Jɷ7stz/>ܝҗ>PJ_@bt1wAXEQL@Nö!d @ a$U!XJJ;>|y`^B6Y#`q)g>"7kDb녎M# ,' +?X[3Lk\s@n ]^8HyH)xQEZ7BY~c\L@Hf9tQƞ")Gd% rߟcۿkV]M9ZM,傦;gVq1CfR>knY ?UswM@\K-o6eJ ^gVGCik9dñcm%<>jiUEAWF 'ƙhÜ䑥1F8!TP (&ŒlqU> J+aZ,Q[1t*bЅ6"T QΑ4j,]ێ#7A/^c7ۼ_'wϬvBf]].7(T;SLRUJ%A2"Ӯdi(|!~Kg3+ƪ~qgS>DJ քTt8y$B3,)6T(Qc۽cUx |ަow'7퇏a2hMw {/qrߒ;:w-~l++Hԋ7Q|߿sfm7!(C~mJrG*/BY z8N;812)%ꀭFxM7'cV9F%$ 駐EJY4_ 4kMtI81\]DVx4OνHe{+GVWkp7;߇8NyWڈ4b]*..+F+(L8;'\\^?I9 )=|ʂh ,鉻3ʘMrFn} xҼU8mKUY cĆVKfvF!оXVo6IiH&ʔ-k Mco}씕a/ْW\m_D҅\n))0yn|3m0O',׿*иUL(`o])ˑDC_R޽W_h@as ,I9ƵO~O3Lzf㉳+ '2y()߁`QxKl[*%{?O|U2\v W-"GuJ 6c"7#(J5Y2mbg7s;Y<}C?}7hЍ+Q,!9;Cj wo+q%-b֜Vr0ru$%‡7=QL4/M{N^a.*0lkn VK/hX m.PAPK zE$`MrM2Mц,vzSmtu: KaH}V*Zrl.vGQ #zH/r`ǝa-(A+ak9M=aK5娦΀߇ CDaKU F, F֜f hX*4"ÜF+jÖ6!+Pd1 J"BH2jŅ ^ 'q@7k󿂵+8͡*{O[ʓשf9G%/YoSp-%4-8xwRp(J!5۵+O)mWp4pRL4g-% x\YhO>Fi7)WH{X: 0j\;u ܶb]YiS)Wrʢ\m%^E26{ 5C'``MU$f+Ob$f+O͉f(T1tBj, TN(Il,M5L0#*gIh eMWa1E>TK | ?(uP:ƚy%M敤wWAI1'!I>D@)!aJ( 4 SD[t1lg` |H䙄CTbUO NRNÇt#Y1Ox(ג/f>]9W wBBBFygۺy˧{ntEBJh4 3xO1Da+CgB2:?+# (5^9{ $G r&XXtOF -02߂;G|lq~78Ɠ66{=#\I\I4sEW3 ?ƯG?39 Cm} (/q qK@TZKaPE2Ir*kpfSJo88cӃÛ[J +c!΍Jj6Sy .1`*`*Eu)PMyV8Nt{F.779 SG ^%N:M ָ}rҁ:㏏30\B4 )/>M.|9 )qtGs?d^Ng CE:UvUN'%&e;59Y;m\왌RBXCtV"pOZSoA\âVT,C+<;xyj8pIOE./'$m\tiߖ*I<~ղ!GٴlO]}SI>LXINi %A&IbuJA0v}rʘ]۠v'5B dcGw7$쨇={ҔQMwgu$.)H-嗞y.qH2~ BNnTtIScT2\OHܢ*CVP[%C޿鱬$1dȕ$=QJ݂iѨDnvXnaxKQO< fC.ܢ{Bؒs;NT+f ]Zx"-/Oݻwcm$i$YwN_3%:qJG+S6gE:(JBaNvnI9rb 'H)Uvr$ ]K 5h=NIevRm[P=Wo\Բ>gSA C% (Tt^նפ0~yE19*uEGαr}9'a5?u+Ɛ֎@{|iry@cJ9 iTζjF(Q}Տw^fEHau]RtbIL[{c!veqMa-)@hcА\"@C,U\:UWcw.q83H ^s ;[`"/a $.@tνr(H_{m:g-Ӗp2ߢg%yov?,?],\hLQzvhT&A5]b{L2yv^uȣUH>05HOBX U۸i1F('tN( LvOg,(86^Eѐr񽒘hI?b(h0*eBK{'vRN).Ձ6Vהw4i_[AN1-|x Si^:hPy8flǨ:dX7-ɬdqY< aKJv C[pҋN?A4ZΙηϙO?GK瞣%I3-9SD1R^-–~@ pXV⠞o=ֻoLt&gV.U×kfHNqm={mu60Y.?qR׿Ğ۽XxWj jKj1ݸ#w"sbb6e>~UQ8 SQ8F.* dIrppӑN~I(`X.6?\~h!bf'1U3&j$V͘4f<>M92+CT`pjx#G 5aG|@5hc> rQnCG^l1:F~l_m?FSJh8PbNJ퟾|φ̉SeW~rxջy3~=B"n Cb 㪆Ab8*< `ML{=F&+F= JatV c# ^o9If_G*f ݸgw"L(N6%tw{WoMq++` 
=^!B;a1zS!XE-)LŠ%>C܀/G3 0,WX'en,=Ƕƺ 3a  ),YfAgI)5<.#T4N.X)Qi6tȢQ i~䨳\PD|[`r XB{#.!x H`g92 &ʁ@bVMԬ1cbC'V{Ĕu1# >^` O0%AZ6jhYhd?Ο(XS-#_ ^I]xp^W:\T 6N13@AJ!kM-#DOWDКIjjKt;`[vͺSGTb^qN`"c7`{a?b+.G%\aS&.B:5=<ΗrM◾lEm){tYL5\3.@Sַ-4s\q@.=lCį Vn+5D)}{h -׽]pKI0hcȐ3R 5 jTUCM\TC"\H(m$ʴ*e2^H([G*=0ڟgtm!-8=V%všwPk0/G,\;w!\PuOձ_(Y(HaKBCF.p̴"tr.^9S;Ƅ-jsz-4pSӽnj,bx;|0:?sZ0YPfOǥ*Kg/E`v0-yfl񊯟⣿M{h1ADTV/cţD3߾ٻFncWX|9Ipk\TԮJ;vy.Ȍ%Q&R3^+LFw'mwW$#*SνO>mVpƪo2b5 lg}5@ cٸ:A!B^t} k\ՀjI>Ƒ\]5il ׅ7|qlەoO'SټH77_~)7iP'2gR]J3;&GeVۦ{d]eBkݪ.̖;|@K҂d$ hRΉg{|2PJ*Bܸ RO"ބjŚljyd,y_\5f8|[8>)(3p_yu0'j_s_-PisY(M!O0v>!جs<Xᅧ!Re0L}^lPAuAaT[40xY~i׀W2uy﹩dVw8-vjIBN\DTq@k V[4{֭)NgԱnkj3xU[r"D/[hS͐eppWlf/d ʄhּ+22|co5Z^& a4~s +L l0 >b[Nv>$ZnebYfHqZ Ұam:Х. ;O@Ld0'1*,e12LmMiJY[#Ăs;,sJ۞#.WmTT-׎)`w,^){~B>kq.&YݮW1Ja1۵ZQZpJD.FGHmmڒs?7T+({l7Aagl/1Ul7͡bf 9qm$S|pźiR8(G֔UD3Xe#ºn[hڐ&2E 9};v u$y-9]AI &ׄ5A vKr}߀NcS[˶R,aD!Mhx n>]nf!I!Xe`GA"$Fo3O73*G"C!NcB&K1[NnK6։`yGLˎ j%O%X9bZw 7/v:lnW֯G@RsAHUʓ=W@2~uRlz8b N#BIJ^ FjԁΒso1^v]a1\0xX &>7 <Y6ϑ W ǖaٰ1Zܵs[Fz=e^ ׂ^p\+1Dڶ^OLbMo[Ҙ1rMoߞvAΚcGBv嶝 Yh mR(sXO'aW89N;u=V?7G?@g&\y\iF+uhE'[g:znsNwQA%# t*~ioˇ~0R؃=W{yIҽ=}VZJ:'vVjߐTe.Ŝ@ĻH9kB% mJq_:L\KM\jUv]& #竈zp&񩺃V 58^@AY%hQ.f#݀2"vYaJ?z\NQQc!YGɥ_ۇ\ !/zcx0}OօDvo^>~\`yp2!(3aYwiD/7/:0U&|}lxofT1e@&h)Rô3j9ş^Iq*-2iшwFǻYOY*<|)3 th|AY#!+$ x3i]{;tgVyq6qvu'PlqQnrG(2nO, PwtoFQsh'GDWM3}sBiE RwjFš3d@vNH".?/ٮuSg{s_,w&Oz;ClJ2tFq³OZGh=W/߿f&b4H5ݼy_8(okOdBx*}F޷zex2[~|?x4soǓտ3qG|$ŷ{ןQJ̅F j~W}cɠ+1 DF:9 u3\susd ~M;o>k t_A蔬x>ڻIOy%~jgtqr_a_׳!L$Km۾u۷nX=Äi6,ieJL3hH*UdLe:@:hDYOχW䏫h =H=5kއ|k'衱F{!JMg&?CL~ |/ɇ(P\D&$+v;]ƹ(˸>r&crN_SƥOm&DΒTg!^XjMf2 u]٣//p9{v %a!aFI$D& 3LLM0B+׫ 'VItAq341M,ECW书Zj+^a^WT9\SY[)>M>+)oM: },'2PRSF2[O782Y ֦ &4Ӓf,ZI4m:H{[&\ǹ 1#F(8ŬkANiSahU3 Xc FCtv5t>mp+p"e3Sڨb D'S {]a>u!3<4YKD Q2)8 @!Ј}XS}t&XD,d3{Hyoܡͅ >͔UXטJ*߱P` jcь˳h}13IPLKh (r0hQ5d ;Ϛ@͸ap6:B 8ƒ4* V* Il*(&BY7NY$Dsk4J ]pf ^-X2&(l^bcptB;5!R#8_H\Pd)3N=Oy|Dxpu8G$n KShz9-:q. 
9U>3AqT(*A$]Đx{yʘ 9hAa蠡%%Y&C) Mznf8]IAu^Sʬ)+p9-|=6J'bGKzX6Zq#ԌjҬp㎕ NHB'M'G) AaȤ1Xw,]#u& ,ʑVrR't iH%8kG,!7TPtmPMا,ylm2%s$,]|`JbfefNX6)~]@rҳ҃difU>…A ʀVǼH]YlPTR&Vz!RgS;J>&>Mǰ" |bGmD,IF}RC :h1djStC/;)Vp`E̊fS0wu8Y^;fB.{q;)PSUȕm;iڲx-2D+K?ХTz2o0(b58PTR`Ԕ.[6( \TWțŭ_W?&5C؞n\O`^67+"D* )F~/WE"S1:2z~ ڷH&.Wj|c%ME? Kԙ{PuOeDЅFdxuyi$GjF8äW)LS֤*3) q8iu:]SiV+ &/jBO+rlNMo=| V^9pyFBH 8_wCI#@>ۚsIJUXS&шuRkrCc7oi˜MȌ y]Gԃ5  !6c\֎ƅ0T[S/l֢ZzJT+щ8rWHEԤR0SsʥДb`)"sYT]8x^,SM5]$wbb3}xAHc,zyOnZ0 z׽'Gaw7|ᬘvR}טݣ9/4}Ɣ88qp1!(ɽ9e>-6x/s~t.7׶dR0)@9!ZwU<,ESvNv1(2q8tv^d[)Khc=yiS$-F0ukA}Fv}`{n_4ֺu!o\E[ 檲ʇf+1wɪdV=ܑ=Sqc_ >Y+X%iaDPQ@, w(k1hMXLS頍cϚ6 /V.>O[)$T ;[=5+# +>Oub\yL_>SgoqRo'[0 D*:]M!nz6 K/_Gn.W NǙp E<H7#.JA´p@k8>>Rm1PV nŤb /%ٴIcXm]PeUqʢ+>FL o: Sf/0Tƅf, Ɣr^0c=P+Q)v 5cdF ϑ+ i*oGJu28VU DpT𥼜U2X., o{b]tzQ0 ]"icDaBv)\x/f52/ mE!wt0j6xPRZg Kymd5×D%#ydh}/;6]` #}jLW3)2Fɽ]T  } ݕU B+A HoG'-fN%GVChA?F Lj9!J="R{D`C#.f3z+&<)=-;JiqNOk9$a.5mqHCL% ]B]Ś#=&]n.W~LmxLmplJ.0JHrEU|w*E nY\vٸ+ 0Vk TssI99I$cn B5Ę<ύЌ>@㰿PLl2 ˝fÞoT1*oU"d'WQWnޮ_6bcd`D)TY=? ߔJY:cJ ,@0ǝ+0Y2'1]iNgQ!Pjp1 &DLkpkB*-ջzv&(!~B](fj bc9ٻ`fbaTQ_cx QY,:hl_pj9 #I$"ml 9dD#_" C5&if+~\M+ >&8 js&>^/Ap2f6_̻2L>eA~oro+Oin-:Xď}k#%]ո <I_xax0oC @S>$}:J5Y!lhM3l]&a J캹h<ha7I!S ~ #5dSvGDpE%ޜ ~/V.>O&уV~o͊G*Wg?Ml=de _IylCR$NOG{|(o&꼾ʋ^bLDͣe*[Q*!vm$+0E$QfZ*?;bnQ1uTK{}9BS=gm<0EdbG(DayAGݖ S=}xxXU+cMϦwypsi?M}GnQ8NNRTOGu5ILRLڏIϵ*IU%^2RϦ90%f(DcKOlbSqy~3 M׾?]~{ !N*BR"->'^ì0GD @qJBH7Xk֜?'B~ZAP3VbG-5r.P9EB(KFJ]PMrrrUANH(ϒ\aTJ؜zЊ .rͺ\J]iyT.Vv'7;vgOA/_OP]NB2ܾitTKT@U*̻:dQ}QDQgrθ!q!)Pjb+‡AF˅aXB*gXygi_|6~V/W:1fJK0t9# tN'7Kc@6$ゖ[=Wv d/MŸwf'Ջ:/:.buz ̃o 7y߿z'W[o4~7RrV %wˬUŧ9UCR>!{g~3㮲 o' ~m#TT'/-LHZ]uT6S׷] 3BLJx)q;_] IRvCMH:/1ĸ)z [A"|$^͝!j Hd=\$#֤\ ]a3 h2YnsLgےrLb2ǤK'K-q#E}.(E (ڲ}JD[(:}SV?~Ͼb"EgNE }XoPpAHJV$RrIwo"#)DLIk$۱`oZy<)I3Td*\ 9$bZQ]? 
RKDGWې*4F1i B]fP LֶwQKuЭA"X0gŐ1BhB-A4@n`A ĿHcrG?7)`lhY|4^~ fA%~J߁:M+̀8'K ; G *x%㝒|w.1=7쨣^_QQrC*$$z:A_c~;[ O!f<(FԱ6XȾK,6Jklu܄W= @ ӍN93NG Bs]?"L&LK&7n Fw29: 挣(O6s*3-p` bU4HXwDN5 rKrjbas,K+r Fp bsN0WX(&IfN8iU6yN`-LJ p[DG5ig߽ulO "ԘZ{/ˊE XWvVxuN d)R\\S@AVW91pBJHae+ynv ݼMp1 &^3!sa \!s * P;brAZɂ0& mbJVbt9e3NEPYp5U +0 or&QJA)6 ZG}~Ӗ p< (8s9"Zi0xCW8B?<&/Z48#ky=_y=s˱Uo[m&4chlB{ X5kUc wG,Wȫ4W1ꗵ0~ c#[P8;G0AԂB*1$0qyxYO͗VSwmYzYdg$bA363iәM`b[dw:AdH(^eyL:sSHI"맃at~|zvd1P+t<0Հ "M ~/kN|xLW.ߙwk;5X?=L#06 owWØ4l| Gnp{S#~eTJ-j8$ sMSZrMe&݀9`zOӻEKs 2,cxG67A%5A~tօY &AdZY҇:z7ͭ5K>,T.UnJt)50^ Pl4?րY(vY#_7t"2FTVo8%TCKZ", vM2SbX`م@%4\|z 0!V^W] :ɞW (0@dfvCu'Z1LQA5xzxk?/Wʴ㋆Sd?R/ēL2" }nV6"] wNa?mxq&bR _Baer}b~NH(WF ;`=% J+$At'J w!/ >9ԽL?y7 T0hC"Lޚ( :s\WrD1w7T!'H饳1>@TX:d5)|4B\Pz$$#q%l#i#f3ED \B/%RrQfuX8s@iQF7<&/@>Qbgɢ#>z62M?|8W{cT~3'A's@u`~~q:H}b7b26v0l0ş w 涕 N 5 7IIpDx3\n6+]r&4`ͻ614AΒV[ΥE %"Fn-5't( %: ~,aFB &p 4wW~Q+{m`:؆axXU@!2v2^a0-OEDH*ۆl2_A gU|HgsE+]i˻ca 0&aRGPŚ>U,X~jqCaeHf{ vNe׷s8p?GE'R(6O,ziQ#; s\s^j.ak< !psn%'eOFs &}-y:̼鈥2G,?bYZ\Gi/T:JSIk0' @.א$H{ -e{!$ \ganS0_ûIrxƙ7x.@z4!bX| ѮO>z=\tyCg738X[g(Bn=?:[<{αǪZqwJ*ĕAhŢkk1>wcӭ|2jѴ0qy][ghJk+Zż{}6hߧjL8DH1Fgu}?Z5Dq'ah.Jc/nbr/Z=Ӟo[ZSjt49.>qaBʶh4[{`׬`]0X0ˡ; w˟~wK Bm JxǾJ+Po^bhHDe5Y1O-xzee!䘈H[lݝ>)#&O;Nj>e|53_Ug7řIҭ*>Ke[/T'ݪ<[+̗h5EŞ7= g֧Q7 Z1zN`LDhjbJ`R*%C `:ᡖIn"qlM5פIۮ%z?܅*0 ºuj|_Zٝ>rId G͝[E F>v靝.ۥ;5_N$߿UAd8H~+2]|bE&brMc!ӊD.Sׄ2gk`WT$~O3a=bH5f-31v/=تqu"ʓÂL]7'5&V#5E#D._]o,OlJgߩ٫]uo lt7̍[̢!-#;[}{rTߴ:d2ͤN֙SqE>lj)J&֬#T߂wpLsG8,r1 .`Η5))lţB+#Z9BiTTT*Jj,_U椠؏,Y>\Z f0z55JufIL*!XKio]Vṇ TG J2GKe7 `i(e(4JEBhI 3ؤ!lG.x.*2(U$!EΡ}=#ȆJ|~6Q(:ؘ1Y2m|ޛ*c֕ß/wt FwY`> E$}pbFxj#V.PcK١,k?K=܋zNvobM/RmTD_D)\rWT'v_Ed~][zJ2(d֑d1tZ=kU:󒧫Tg'ū^2՞K}EYi#o]w9NY/ ?#n[#6JP.Ume_vd##)"3#CLL%:$DȈ hhF2bB x1ZbNYf!ͷ>cQsƈ7׼6mJЕ!tt jK9_16g}Ӆԭu`+}+[ds9V9L5n*V 5M9o_[8x?T;T`7kxU&WpH=e}OI-|1Tփ gZԃ[`J-Fۂ1oy,<{D=7@֚[adHݢ'#_g-bL "}U..hךp%׹f"\u 0qݰewAVF?VP 8[e ar[pF&8" /_ES/_DI2M@Ǵ7`Znt4`,1 e q^):~ksUr2⸤]}׃K/M4-?xz#*\;B/Uu<|TJo+OZ2!DPbwA,mgZ{8_!eq7;rڀ}† ~ʌ)J!)z!)jq XԜ9] 0#Q:)Og<1 
IUm6c@"g4ݲܷ#,"mrVآƉq,+:ku?9#O9N7eq[]~C(uTؓYԣW"OYvm3ɥU~Cœ:}ri9+( Q/J @g'r`*$VΎ\,)PP4!V11ԧeKpX eX(VZ*lWf{' p -BSP>A1R LQJ #BYEjn%ɚg ATT$bxz dsupiC %S04VT@3/n<ޥb~{]N{O֨QZZ*'g`:p2( oc.Ϗ[kD)UH"Xxsck}(yҙM*Ƞ/@_ +Z;wjDP6iQ&C++?>EX5\2 ndwq_q?Gd%K6=BnwWX,6V =-A%mh/}s>lq RkBRi{h$s}7OYt) Ml~zKFk< }F Cm.9 A dbrquf4NNj'L 4_' .Z3^7b9c+5bO]uug4oUVO?WJKYҊ܎vWӣKq;I-)q`oc3=lѥI+Э*Γtvr zs1hQܠ qQp(-[ VBE)ϯz}֜ʼ:oZ@_jFfHEO%V7Eҝ"-?*yEPONaqRɩI5m $>6لMY8hn#jwJJv_W:΂l\;kb(A ;Rf/q] ޥ|K&HK-(n2 NQ{. 9ʦl +C K!ԁ{fqJej^kw[?;|[ђ}Cdږ[XRRJXYGNV,Vd(J,sXU+ S;|P}?2Q2b Eq=/(8dN=psRް^(%~#JT;0 &]ʳ,=kp*P9䊩4ޗ>@dea*_= }d""C.p{EƒD8"2+2ViHZLh O*($.5 `NXH|>aGdYeTd2zs׋V/NY њOP_= hMw> PCԅ܅\?2!Rް#=>BNdkR甎"vwkgUacBK;MQYޠ8jpfjCSجV>nF8BLje\mjvɛ qN_6 =Ҿ8;v˻۫p0 ۫X4 v!?]8WN*gly ^Nd/? Q=»_a7UxNOfJ;+bw7wA+ɜ!uYDYFU yTaVLe >lKci4P@IR]@8?jNcKn p֭/)9mSr+`֭muBCpSøڸLy*Q/3#.NNTʠL;.^J}gb#m}H<STz =[aMT/P~ڀ! uJa~o$uB#E$,F$}R(]x*4IJK(Ӭ!Y 5䚚(&`BKNZ?g_G&uur>*?27>db^7}8S@_WqM>e#EukXOFkx>d24N_~K)3;J5w\[Ms"sqKSU*S eXm#GuKfs㑣l:F[6ʾr?$YNCD傂N<̻LJYRmK_$uMeb *6eMzW$z BJk`ePBR86PvFMGet>NP(!x8ȀKWP5$,Fm7k˕Y(`4BrRAzE`“ûQ`F|Zl, Ѧ?Gj*ܴ*QԂi$Ÿ~{mOφ]^,z.f$֟0J˥IgEx\|XWqӫ[4ӸOz8yxCy5*N~%^<N[q?d([n_YAՙ9mx~+oD2 G'`L$`k3pVQu40/,J`H1OIyÎ4qs`%c9%ǕGVhQ*B%c6v֗m@xW%M22ma\3NqlL.a606Xow:*|,EF*<$^@sHlIO9O.$&]!$]eF <|C̳jgN[Oryݙ钒red|jv/; _1Yo|lh/yT- vZ}lXvd6A*#eSeIFP29(nakCf,͆In2 ^Hɔ, JQ" pg'VEekڻPTKP^P5>$qӶZC]a$%i]8VC% Xuxu (+}MHĆ0r!`#^*_sSt"zL.v VFaL[owQ10|zfWiIa,$ #K 3xƨ%' /zs J~ ;K^szsׇ9%tV]0pn7Odb4]c"1 h4B(rs)I[?kQ]i)oؚi1ڶ:ga0NVt>K^^,Ȋ,V#9/j(*iJ[[hY8v|oi~}]|ߋo2k|7z X}5qxQY-Eʷ.oW~X&ԜJ]<ĐW:H{- KQND)UYҬIZU>P*ֵeij;F`UW׽Huoө~IE-Ul iZaʠQ~갴Z$F .@ ӴZG-}Zމs*YP`K&LZT;tZ7?jReӴT o:-}TtW:qǏ3V5ZwjlO9jRG,8NP}CS'V]KSZ43iykb&N\V-:նT]-:=o;FKM&+&Zt%n_-G8  x: ',tբ?|OU]jgmU\+Yϙ-̓u9VhRS>ץڭ /iajѣgm8 o] ;|9cO8AU)iHRσCr ػ=UztQmmբ't~v-J4Ƚ+r .^M(ɒjU,1D0H,<.1@BPH Vm {r ju1xh4=42 M6aI4+:kM𒓈fs.tLIF"O䣠5*4')&<[kۢ8ldP[Ҏ"OS]<iރqXf7{\F՝4wGbqMeMsh\4=\ZӼcVJ0iiU)Pq[!Ӛ޻9p|+Bj`f]]+mvPӼ `YpiZ>0ŢU9#hţV[Dxd5-:roM?*DJ.Z+jO֭A T|6PJ̍E 8νf1|3Pו$JR՜8@:LxזVFN=a5|t7 _C~ HB js,F$;u会_Ň ~{67Ah!7뉲2«Jj PBUcݧf\5m% o% 
AX>p_!"lyJ- lVQ |~^~Ѡ13U7J"X~-̌5v-bĎr(a(!g(І,q} i'sLr.!Y~ЁF"d+}WUFi~Ż ,]J|"Ҽ@J/z3gS.ފȴJ;Yi%blG܆'۞@ǚy(͉{sF C|IHv=%$ZP åu,†JL9)`ZƷjv(P?.2`o6~o!4! 4=vhy 1:3%^;4"!R4F{4"f e$pG!ORzP}): "q@Ʀ'D ) ? LG6&j\:EVZQOi|fsE]p?wY=|zN_2_wO?5h b^P3 +ªfȟoP"Kr||'nNv8zoOn>Q`@ɵ}c-"IZwUnnooq P3̫4W<6ڹc|>!OQ-FJ ZU1R61(cPa>txN7 ~JZ3z7.qTu+ ,:׈exn>s:eQ;] _h@7M_-tQߑ{ސai!n%5ZBubaPDf %aq x v0W\eEx=_\T?"ÿ#~|r F?,GaLA`q}u;-^k K=fx_-FK_sQz`H @쪺;nbm~UBCӺi4Pf)>Zp+Y#GmqV(D~/jBK693g#'VьzeB?{J>D4&#f jQ]7j_X6d/SˆEok귚?&쭨~w]VT @m[PRA}di\nt*0⃏Eոj7r(@on!?b ;tjK6FvCr Nh.r"eͿ9`t齳(C2+=+֠._ĴxH֤Lᅷ\Mp[RwxovA JobS!N&/1z,cByd7 /~c9ٛorR$;q>e fM֮/?Q΄9|Z#Mvl|ZF/*߂Zu$/*]GON` /(ZnG-wӨ1^ \fÐ=m;绺iA?C2j~Tþ;[Z8;" X!ѣ4tlTBK^Fȣ(u~P kRmbÒ{w[ F:xr5ߡS[d`0I2\TUp},[)h+},hЊ{{hHͳҨfUOz?<{}xGh=F- Myck.yS$Ֆb3Qig&+ Tܵo\y8+g^<5eDZF|ҭ/JX;H]HSmܵtکP !_9)y+Fyc[_Nwn{"f<ֿ9utCr])QcQ)¸8װ,Va [r (ܼȃ6uTH2foC}"X}ҷXzeÉKKz+қЃRʸZ KKyƧ}r)/5Ԛu#RyN\G=?޻4M3}UaV{g 1YC9W/߆bZNӍ!NBsddjh=ϡ~'S⧛x1s o?SOM-& 4:~@NX9IH26a$AAAOsJ^j:zE}.@O^FC )X#o sRXI2&|,n{2L y!L(aG;HA&nơ=6owL_&"1|YW:) joV~Ub|su24 n_{וh;vg}D;eB{瑩,ѩ~gW.]W<_TGO g-z5r0OD4ScBlZ{l$u:7/mhԍFFPYg4hJbm Ú$Ӱwe`ݐZy4t~,",9ʚdS?ո_ 1wP &Z+ ֝YA#\&"csM"F,F$~n}P\g iH&ҰZȈaxdSf4G 6V/h d$tK+]$1 .BdJ"Q{/8xRto&nV b{sx!\BP_C9%$d9r3BaQh}`Ö(G#Q)q%Z1HV5B2PQKF3iQ1_]4} 94^ ;&W*ËX/컻~SN˷>/69~^ReW~z 5Rס#`6f"4Pwc\h%F!rG;UE%+l*o橬۷7:=z u}{tU\+:k4k)ʛ[*T0N;_1BY]ƉY@8[01Z*8UŤnv \Z6{ dZTE]RʸZ z 3BWBծ3m/֠΍KK)RZWK)jII72^ƥ}&Ap)P) uT B^O2N19L;p> ` LSօ W)Zl )]y]'ʎ%am%,Yzrj)؀o@P&>M0S[A7V(!iRrN;5ADF/pr>/O-@CEiJY`y'+W<7T[nĖuI-T N4F#Xn ND8v;QۦCRFyknT_&?G.͙&ٹt y^9Ԍ^mLFzraRo7ğ2~)&.W𳫻JZI$1w֑x\A BzaG0+r]L:~~pNUpWga}dMk83]wzܫkA NJޞz-P J }* Ȧj,,)jh{6`5'2 ) c[eI[vA| st5pI[t(܊ۮs`ZM(z̑ݑXxz_&9E/$gE)2zXBD5ڞriD-a9w7O­)R67w +׳Rm5;0hd}H@2IT&9S,P v~ s9hTO cͨt!OTE*S"Ad* *lg<NNJQkP-T26h{l~PF7&n@Wؒ%FP>nKWqeb+|%HjNdcO;=lv;-c{?{ܸJo9Ti29{L򶧦Ʒ$MQ  %;J*}@X;:;Z;>^=L>˛;7VlqTd•ѬEֽjmVcai)9KP2~=x-j͌.?Cf^K[O *R>L#lUt䂓 뫛49٫!? 
V,ЏU@x~J/o怀ۻZ#WWw}'xQ&”<~{vy !/8#S=nP=<`D[ϣ@pEon~uuzZIFĎ ="y!kzh O3X { )n(la3/Yz|u嗯=yPT!<oC<$/!Ϣ.e& Ss+˗4`S鐑 7+RoWWKoC3뚌ymE!z{79:wی7ɮact"!7PQsvG=E@$ jpY@[b6{p$&F>ķqh` E$M;B1JDhI[58b])וBQU =-P-^-KtW3zr-&#KW/rRG:k.&bW$zI>uyTIàQ\ ! ` vyo^P#޵ { ppm7>Eu` }Ţ\Q::z pr6Lj-,w몸2ؘ',UsDne>Cܹw{ jJw]yg}w߳og8.+@t8խ]7;/7o)] /~+ozٲmů|`p>^*̵z6\ȸuߞf9__F{6ǻw#;&{wX6^3)Tp+ްDb`ST.A,Ծߢ0؃no!cL'fa[%C{YuB gaJZ^%'IJ$R|0^'P3Y&5ŕ@p18\pzݿ)CmͳPqn_~r'X@<g&pB~v\2DIaqq{ES:'O=d;%X .T>̕D!Kʲ( j)(]H]YysxS2x($Nmw(SW3@2RYa JK]N|) ^9]tPo(-;Ut㟃ĄC85/M:~pnNF1Qd/ū-jy+_m8c9aQT%P>Na. ;ł7$7I=C87 0t4dS~MGk 66?Or=nµ+\nKSXRRlw" RbaqJXWi (3h-V3M2EI$S_?;֝=x&b i):yTH1y QƦBiP%@@R!J,${H-AuQ}N#Y$S_`RMOl6L n1/F@R L)A 1DQVr\ryrK$R$4"&%rųS# L}Ic`x"ぽL}vrʱSQt]5xoM럪Ɋޚc_02hfMF:F*d4w^~~CIM@e$ ic u B,03z1Ӎ㔾ę6WOa~0uG&e=zfWMp9T[\~LJGGu2>>g5XA[wK)9^ R#o'@~|7Fn] _I*ߞ]}C<\!yg}߭ c ׃>O%\ag>^ ՕV`l&xd$ټеxw>u+Xd&FCj#N..[q)%{K/]yQZ5sY3y6xz&WW _(h8Kvu&}  ?f _WKz^~RŖŵw[",Qރ e@\eU%3(?R(,A,5gTfY __/.a@Mn{=dvk7օԳ>2p3ՑXz0C=H k8 H`O\7X)t9'*g,?`mDQ|nwn$0%@'ҢJoĝFRQcNg 8\yy SbfLj+٧WJr42dAኅL' R\_J\!)cU"Y16; }τ]\Ĺ/ x7㦩;諀{ǟDh&וUSs@CW8b>[0=@=iaһ` K]`A4%{<'*xХ"fhb0(E+dHc8>3Gƈ:WB=ѓkq7Y=zi468qkK+,>,j&f̀r7fڄܐKM7PQEy8iU ]soN:_$OţJκ^&KZ-{tUB7,4sJ& I-VlvmER#O:#U>jI7tÐ/3wk;"X}[i [IGջuCs}zuvލl0#d=庙tbʜZi&B㲪R'C-f m 1$j3 "k%øK%'3Jr^VlВ*ARpw'^/'ڀ/j*uRٝG4RtBC=&;2Y18?B.r`Jʛ 1.3˜M Yʛzi4\> D',X9 Da&6 i=q 4sOxSy&|!BN(&;uOpw'ph鸨/ (#fӺx Nz_Fe_ܥ"Hq>Ğ$|!Qwޗ!=j%lbUX$yx,JHT~svכׯ޿9;e߿hXedlrw~­3HeUjRG [׬ k i'h*#04u$R`QR2κ+mKXUB"hK~Odo*⼮OI4\mF@3Ce"wyq1Eg'5Yݣ0`*@RYt6AN 1V{YuU Eg'F3 SsQqXU!5Ψ8ױ6`ʨ$%J PUt7X]?L%굊'#-QLaͥSo ٱ lZoи s1F dYAc&,9XͬFNf3,%*%Ίܻa3 N] +`Wwvoi!pɰ& V@)CVRC%q 4VXTK#,PԷ]UC ȩ%Eŧc}YȊ mmO6@-mu˘Ty* J!H]8^.:F2/b@Ud*x'|(K¥`Uv}Y.Z!~GL >Ե (/b P4Vց EHVCܽ#?!'!݌J)o MѶMaF)76a4fpcۆ/쫻^Iq@+V53oW1xuw_>4Cӄ}'Qӟ}?*# 2V87?ކϋ=4C8iWE{y_?}uqɘV9J3Q_7n{q4M&i)ACK<+Y"- t U~%V.92Ğ:C`}~#+O_1 Y7"[ w(7kWoo}kӖ, ȸP[Է]Zu+CdBtk)1 HT|Vs 8sx >NwrT\ A9 DtnHCԑLLnjj6Ιy³*g80\ snTr__=Db>MS[11WBSUCuGe+%;*)x?0,[ؓa$'ຂR[ &F Kq\ɞdHѲWq\hwەaB-&=Mm-V:ssO?|ʕwn?WBWo[g1tUZg?x^|0HG1XL<6%j 
vY].5[̆=뇖)7'e#y3w!>eZxst7i-Pp\g;r|yW)ɢUftCR&A-"w }GNtqmھ`m8gOA" uDLBy{ͧB9O8HLVԮ%+5xGK%A+r^zJۮZ8q1Gh۱錄Vě}4PΉp$rO[*>|b:$oBmOńf58h&;ts v]pPb05kZ9w꺐V1P+jSBEA“k{ks^ezҟ -nelj5moY:7 l\YSb9/x% ;5ReY@+F(C g}j:~:5"eXDIl_v9 vY[vMFp|{:t sZJ7H u]O jO;X)|T` ٷ.'o\B-ό9 +uR8eaua‹Td]EQԝ2Fjo)Z& f0s*X;0U-2e]U  c0D+񱘥`evYrmƐYECUz$p^80 Z9 Tq)"m-8[J.E-P!fQ }R45\F!Qt!v5Qڑh 4QѻؽaP(E7vT=*1h3TΝ $$8̈́OeyMdlcvzj;`!N<VYO.1uq[S1,pOq*fDbXjsOL,t- pv,o);jKbߺHZ; SyLJWpk*Wlj/j<`$Jӱwy6V<,Hl9+46rMg3qeOLorrs08AgUs|NM9pbD mn]Y*t?}JB&" ب|SOKMp-,qį7zBWm ZCrvO62Cu!n&dSɸe?Ln1<䙻 yz/I 8T6_ljdmx3!ET GԌs/."w }GNtsEu6ȝgb*HWI:u{%D1dTR9Ff䝍b<{fv^77+ ^@ !HMW^! Tύ|D/KMMH1%4@aԢP#x<.󺃤CAY1EFlĴ!ڦmP$E! 0>dlkVSs|CJTPWC B8]PQ9t6x'`&h%A`)v ~Sj-CF.E$uPV!ZUEe{E 4)rUI.]MʊP/1dC ,tNaPuvH3ڐms 5LaͥSoa@ oZo᥼иsA1F gYH@8a42k Ff3,%:ǸՊvݧW͉?JIݐΡ N]AŒzB^z8۩W=&n#&_Wr!U "ZSڟ<`-Nav4gz UݓڶV9\b#:[G'p6ϕIemPr+0.J"| k\ >R`nmHc©E-`<7Reul]RZa3/UFjr\Es㽤Ex)2ci#5iVc^Jjn Է]JzE{:B'ly)Zh@9sHZռq83t@ rG lsAor8Y5tPdLк8>Ux0uɗ!$o[}q8W1UU??|LL/ۿuHM>uȖ3t8!]Aزr^Jd!} h^W` UxN B)B#necR"c3',mnބJR01)gu\EUgSqԢB$!rUJڃ#Tb Б DB3d)??C #+ڴ"xqb5"6bʲ󗱣#&,9! vE sqF\hYB{7s@N]!eB? 
(C`dcS 12B*Q{U ]ԑVTm)$u!yAPcvmjpv|ZۮdpgTw.=KY$kdA.a}[ꬷH7{=#M"{zFZax=j*Ç1("h\sx.JC6,y̆ H#nc(2q3`]X]z`ʛT+7 e‚4dZښBsIf۵d^f(rùG2D9ӔZ&+73j:= yci;r-Mvb6u횗l׬sٮ 3`6L1!:ig)`#%ar)Qغ!LpcF9UVS 5خ VY*[3A+Q90AeU{[K3lל7lc6ҙ3 "F77:e},qS:sKD*-;ČU8ڵ풏wӄv`5g~nە@ xLu{Sp]RY)_IP9oE%PלcՎx<t0{;Y}EzJk:}-nYpڳ[|_}j>4윍IJD[oY7Z=-ٓUqB.hhd@ J6V0'Q>$dkfRl4 !0j>ࠔՄ4bmk&ZQ9 xlPL)\ C՘(r\sIcpN%䄗\P \ł#1jNB%C-Ȣ0R&F%+BBE¨Lͼsi60lMQ(0"4h`L:g3I{Impw&28WFL$3d;>AP -xj &V-bT#$LbJl+gC`rH9%r2&XCp͡p~p0c2MH?l6G;h^\8bfiJ*7G~;??wLÅ& O.>Y2)fL)P_jR4*7އ3L6ӓ%?]^w5 ԡb}j(.}ٹCKRROT< +(azziI}6پp:lx1~gMש۠̌҆8-,n%z)Z-h.q>$^{QS'8=EaY 賩tO)^5T&U-5 FGEq eȬp`r2#nAkܝ|'`Oל!hq?oG7nΪ ( mXC=ވ$\>% 7VV',n#~`}!OEُ3y#Gûx<~S:ZoY 5I!}Rg^nH'uJ<*`tdom0&葩P>IIV< }+Ͻ2I7Rl2|޹k{ -l/PDWRyZbk`]agLshZ5;F q{$P)ݻ ~Q1jRnusTtO,+'KQĥlR7iT:}8_!<,N]vԖ[0 f~0;o^mQ*nǓU CJܩMЎ 7q~wӜnڦM4-/.Z}+[#E7CbJ)q#FvS]f(n(6YvNncD7Cb*ʎn+ׁ0 't~GIv;,B 3mCt- 3`nFVh[z~J) [{O6͘]Hquqqwy6YB~\7t{A-0MWŦ`Rw:pSgpJ&lD MO߀E'c4A9l` NSOi1u{B.cZWDT""Gʀ* 1 !)Orюx%J_$ϲ[>H0.v[V|%;Š}CplgrXdi?t'& <{'|"Q|tL?$dkfLWTa}='vKjx,DX>[Ri'\2q/d2]R`o.[%,gaay0 po@pCvX|5`Tk:jl1M4`еޡmH0 ChgQ@Rtomڻ4Oh*SlLuu}m2 H ]W_iYyKx4[w7n~}7<0f9MBsz gdOVhe Fj&ϴ+w~2uDp]6M l ֭=.2*HZCe,g`(7ŷyiO|O95, e ~ htPWhF5OƝL]:e2#LX4"^3͵) C&&~g %a5aeBJyNj-#֖[= 3҅(}v8d!!q8)tmi_(Nw8KY4qNF&n†K#spl:FX\TT9V5tU WUrWC DXM|!F4ZI-Hcc1rU/phG-3(d|`HeñY IkFz+`6VqC`UTϲhvvsop Y|GYGk?l7 6w CV l&VrqSob#4#SOꐐ 3{!|}[m,?[3gg)pImNɚԲ#p'xum*j΂5^4EgkAƚHȐ9a<I-d3͉87fBȝLqbcHxgAN Qi@Ax21c#]fJd\# ;`L;@u7W!t| fO/zyqz80c^Gk9c&,%|"pd\0^ _oh4Nl9w<)f6ghPL5?7f@7z V6Pݕ)?r]յAaS䈞a+_OxNTݗECTZz2ZYjm5xS_ʰT\KiYrՐiuIo|L^)Ȫ'3uRL2-~O2Fb_Z f[ϫP{{_ajY h5*pp+ Fej#2H{߹'4 Acw[zki?{Wȍ C/bѱۖ#6fv1c^<с-[I}l/Pb$)Sv-J|Ly^tA6h;LD8m}[ /ceb;S[>|cxۺBgJQVdW9sNy[Cȹ=rn^*D-Nc[m67|~L?NwrN.HVg6r [lj 6mYO<H.y^B4V|~s^:FjabNC_쒒ir:K4qbtÅ8C$Pj]iU-\@䫧vgQj+W.w/%ޒIF+8OfahdG%cQ?V%{5]e1%`s8%EJ =9awLSc(n8oJ%\ƮkeO/z;Y'`׻EUŧ^mw~yM:X)+y]StVZ#pD2k h\˜3g4 @H ("D9m=wX`I L,<[>,xHrp,o>6*ͣkvɾ,=>FȩԃA>t褅.`kS$4j$JNxfӱМk-[ٶl[.-g@KjoSOd"ԈJIջj E=_jT?hHp(4Di( n3`4LNS۾OOR)'ma|d?$ u$D$4(!s&\fГ 4%`2 6xN&L6fab&Z|.ZD ݀Tb+K-Dh%AC)P@Gr 
V+RF5KOzS1uLyySX]^(Wta__=,np9Jι d\.'H+:1 tĈf^umoYEo FfEd .HeQe/F<܌[2e%s@.:_ӷI,gT5opFEx=ʸ͐ C5uX3N ++ ?kv=Qߏ }mA(0t$rSR|, HmL$ȄPX Tk1 R3@YچE)$"srK) #>["WR p{+vʰ<=_-~f柢}(xhSL.z6 rWX5Eo<'SԔW푻wmfTÅ^Kw~[7ETPݲ}!v Q@ax!.^9SX=m(p*/u]#FTw;T YaN 2-1ji~Ռr #F $^Uprxe[oow]3 :/f][H'o'c?O."!HjvB ^Bb`S*(T滨S-RpOQCJ-7.MU.Bl@~Xa'[T J{l!<TFЇ$c!oNCX kuS&![y4 *k<|XPk] l_## u#Yr=x7ifjU&mo_ӌ׼i-Rw*Fvrӣsq3=U1jQ{F4r{Q= NVpdfW =7:0@N`-Q}ӹ*M F=-#'`ߩd\.T?*vw{oMLS'M jWmXbFRU ;8=uWQī{{nvMS|w[|Ut܅õ_y U};ck龮Cw;y /Y%J~ KcI)l`I%,y%ɔ~&tm=S1Ѡt(UHVˍꟓ1dȺՐ N){-w6F.eu/\=7\C'C'YB؆?~e.($3uN*U*%/3D30hQ`6xFZGya ee9An0SaFUN RٖwZ4t U^'XS]уhVaYZB4&|²ȄD>לC{SVhVS^&u kFGJ9 xޜyӊ7S!Z QB{(cC S"E>"I7VSuQgf WB٢nW.s2R֬|x"|J4J{ԛ$4jQEF!k8WbŐs;сJfߞܼŔ}Lɵ/Z)㱩& 4W'a;ƒ/ۡޭm+Nlx@ ?xjhSDQp>$A9C6-/[^C9/-a)EE{Fs%ՊySn~y ^M:eN=ҩ*HgOS@=-enDcAh䄮;{]* ^fC0($檵~<)hBo)bUv͛Fy9ϯic53_*>R6ž>F 7K r.{S p}5t:@~ #Yk6x&g[5{UfZel&=-cջ4f?K 䕴nkӣsq3=6(pMmV^֟u_jyӣf 29(\M?_88HU? OmWp=z3d!!زVgwZ+8>lBfF==xhԬ%ȝ[S-)'޵q2K}DX3 HAXt" 4(.ߗ3>wٷ9փǣvwX,_}% 9^^)⯞h0OĻ[JȚ~jyk7g- T-xflQW"ݷ b`SH a]J5t"Jk12^_mlG1l y.ZS D%DRN7r|Lɢ[9,1 !oE|J3;l3FRN7rۜ%;]t˧!v}-;&N+0[v.0iw#ASްT I}s-_|SR &/S KY;/Kyi+/^z^ 楀m,KӼZP9o/"KyLw R.Ҽ./=G/E(R#K^z^HZǽ o2lNH{-D2hA /EZ?n퉇kqv2$wIQlŽIT1kKQ@FfSjf/ JAgg@_~ cXFW?;si"`ږwBוL~~,%ɩ%Om-&zeeF7O7-T}F[gZ;Q\5 *BRkJSlB ;Sn8YJ؄hz_xQ_lFRα8d Rmryńf$ax'A~w.q%YW?ɆS׏ئHO%&M k:Uny%L:ⴱF+!ґz(E0Ia V e96&b$V豨2E7ڸk:vSt ؠC9ua8ΡEwNĔdtn"$@d=l`aKkvIB\Nc9'aSaL.+}m~qJINo8&B92le+&]bϽIgZaSj[`^9sDHՒ xWKkp5RzIͩkiN0<C|܇My癤SwO# Ypϭo< Hfo(o)AťW nqMd]{DNeXS#5&(_AN<3/kMFZ8IYmr@mǐ[gU٘ l,%^S\ T fRxbk1J_cM4zk@yj \@0j|D:!T#m!2oA5Jy5D~^4' vo`J\'5D^ǂfR'9ɠ}:I."zI}/EyN(u\t i"&aBa=oji\݄8d 蜈g܂2l-YbhC",~y-v?5!K# vn8 a‡ƓGal aȘj>',qx{JtǏei9^l9uI,;U dO; j{=O;?-Q-clr9^v@5 t=BBz+izgZKڹHjdI[?=1PKv Gx&a"MXQXc>Q bY:{sX7l1Q1ܣ\O!XC,[s-πDŽ WZQqYUS**CHC`Spcx &/.8p,]9oޤLt߃4Q(8nTDr8b 6`]B;tOg{s@̮ټyyM2m"%nih2!ޤDyي?@K*Rjb-h" h#3r#1tQ%{HTX+g O`0t暩C^&Dϸ>ԡb=+JԌϗKԷ@jcs르N2܅s,/`ՔMH_tҴD/K:Iӗ'~1/ͳŧu;RN/9 uT[R`S|cZ8>q-)DKn:HqoDű[9n 1 !oE|JF7`f:H*F|G|VL4C޸) 3¶ ӝ/6!~6tNۄlY:-3ehAZ@ y*GeOW$+V,yا~_?>T.̒0/5:.TU 
lY2/BVq;,ǘ"F?=& =S嚄vʗf'C w#5rarre#C*CP\ cY?|ZfѿԷ㨛&նw~tTv+sP$lT$aa?;s I7⾮M,2cc¬m #ݧo3VW^Jqj* H!`v2D 47o݁\Pd6cxT\fg|KjL -4Sm̊'g4#li΀װNF),a!\^=S]El㝻y|}|1l *C\~Pa~MS^m>#5凐+lCVuJvn^ջvk;g}qݳi<5n_z}s֨`wSk Hs!F u*VUmF< K)Ox.GM8w 26WX@AqR!t 5 XzZ JoM oh*n۶<ŲH z; ~y 4?A>~1pO3M%Vd<ɍ_AyR- >\yfru?~f/f(dp*oA/F4U9ʝ3A^2 tܩv~X5 8jFGqA{jJdd%C`mH(!rLmkFp4g+zG<1,6p ܏OS$z$ángj+H-sDv_Tdk'>3BaEb/,L5` KobJosRaA͚ Y.n) शFQ\JPfJQ($J*>aZݥV9l(M1ϗG{rB#ݮMPXκ#=Gȡ>ƞf6^}ܩWViLjY;TBMrp]L2eWc86j+⤔A#IA*p#J)Fp$ѹcIp&N'RQ2d`F]d\z`ngb .0Oj 0O:1Apj.nkƃڗΖqg6,}ķ#ḇ<.,$)^G^w V Ghi>BŚpwN7T2f /FWz6< *9hv|'n98ZJius2}XDoW8_c J^tnf/GŢ(m 'PwRğ׫2˴/,tr|]2yQ}ʧ=9neIE< lȾ|qqp3fE,@4|[u7ݧJRYm[M,D6qO@mD+ E-uֵtQ^,z馠1Qau{i\|9Qa)F# 4FzwXzk)dOaid\ v;8ՆRm @ 0*b6ɃU1,r221aL0cR8(pRPHƍV~Ɗ'ݼ%D%Tu$Fnb0L 0)Uݑ?jws֥ږx"7?_dJ'Q6mC JgwWn5w Ǽ э.PHu^q W?uۇV)r32վ¯βRjvuu*= UʚQ *'%Fzǣ#KyaEpd+)J!Qn[,ͬz`Ȳf{Uq6 +Fe[,yIGmv^A+[-;_+n6}j;d9v?ߤRK4Zڜ*eJL\eJL\eNhr#@ ؠTib888jNXqkMA6X"`F?|%tz.qu_r a訧~a+mq}|2Y'˷̜r0N]TS}GF =_pHv` N :޲Me#G,t%eSϦSRQ͎wx+4iKrzSҗ)%}Rҗ)%}YNIoޤ?qTsq K$ Ԑqܕk T5_Toj|g7gEuU=l]BzDx%ge.}~N3Bp`3Ab(P/:fssCoQRAvG8$Wל-&ѓ5i˹TǏwV@48zI^! U:N9A̿azS˩Z{;A4%8,(0b:huZ1`; ړX&vbseAvh!Ht^̈́A &RU0 ) iژ}a^ij.a%R3RFYީǥMZs~:/E~ȚUITlo+Ƥr#f٢w Y@6fÖj^|Cܞ~zoFpo:q=yW3M|#,_qx嶸  ܞ~'q#;5N Y' $;ź_6n{&[p)ŠhlaIn PzZ ;A <VlE wsUEU[_mx*C(c康eJQttܽ}R3UwmU!C"8t{!UO7ԥyO>W=@ - Bc ɷ*D5(HfEAt FYO7e:S5 o>LAB{mXO>WUv?@d<<Ԍ$Е=_'٣ue*LE`\V. 3 )iKK'^hR)-e}Q/?~SG/?^~{uq^y/^~ˏ+jS 'jb &qΙzᄍt?FRߔޤqc΢^[Tl Ԯ{k,){y{ qN1QO`^h= Av{ e[3~TPmM<mȶX:Z}!mqyU&5xGCI!]YWP?8b˃[J}+t_O{T{3u4^w~N-_hd8Tׂޖl!4Y ܏8ib! oś{tƛ!$:,&9>]L*ײg}9d)F$Q ykEN{P'm>iE\A6M.h|ɖ!/%/GY(`y }|^oތfr5Oǣݥo:P Y,Oc٘1ew͛Rx0,2⣞S׋c{ yl|'$Ry3V8B8L$1/^}.Ć~@\ %eNW 03%*TwjiyjZz˦cC]fcNj<;iirkE1g*:/ܱo=$cX4Mz ܛROpYjd89׷s}3Oӿlt.Wɼ>OY_/$P^'t~0ҳ͵w+C%"M_!eɴiVsķS3xs!D \q芾/G!Qr-($uB&Vn*>zJ.q3Ja>t3*޺P+E%!id 6㝎ʔM;U&ϗ;=M [4+<!\(\!aK!Zr/H! M`HpĆѕsi 4{wڬr 'cl=9A]_-_0ς -K H=? 
j6iwv!QɀaWX=)P+IBDϯ\dIdc:!;*n;q.*!K ]5?"OmO4>h됿 Zio=@"{!vkvgx蔞x'~Ǯi{t'")m$ `'§IX4l"~p="HZS =yEP lKlk1Vm:zGzHe9*W䞒)G}5yI\$=QjkLAHZcyb6 #KXnI@cdSp#"Ǩ$@IbD6~qK]zZ uB1s~櫿\݅@{ϋxEo9~uIbxF_^Hg#q& ؠ:$ZE+h95r"@q 0@FWI/?ww=Fi"sà A[4 :XڃѼ{.@- ڿ; @뵰uXd45z0MZ"4M :޵q$BnV#Agزc7l'!1~P‡+_ )HJzH693Q_e&D\ʌ99Ĺ{*k8#,+1 *L{c@:{0:}Q@E^g:m%괵AR :w B |;)ƒ3LRs3#xΗV$J ziǃ\OCY:?"Y:>+|?xܘBi(ӱ-*[c<i(FlT $K;*0k@U V둥v'NKHuK頼&@WGQ)RgGt鸄ac^[])6^%Z;#`ӟb:hx:rOaRJ]hRbp'ŊZQTJdgկ~0Ixh V`cu^BAJK(@}t aۍfړ6zyIx2;&ݎ W_3;;nFa+aj8) :@Ք*b !wX J$Z'112yD_{=hQf#10T 0-?GyQ;*_oR^JC%Ia1Qé&`W,wZRfD]A-O\g'Nui Ԁ5[1wCh[bRhkB'G(Ofݩ|["{NڡV?`4$#r_\p߇qޟDkיrd ?'UFKYl8ijqxY1#y?f#HSscpG$k)ANy<;)mt>SwVqBen?Ldk9>L͏eK8|y7|}w"DB: $Q_uƣA7 Rg6fNJ IF!,3I$ N <|?,|Ƕ{lͰl-sč,`h*eڰvZs;yZS44gPXe8.B|3 RYm'pz~ӿí>lB'ZqzlZ|?eR0t>L k c0|k'D-:)Rip߿\O_֏OxSv7{Vo 皵moBjbd-6,mON_iosfGi~} pUXk鵂ԭtKӛR">^wF{\ןvM|x40 V( ptMQ3hߞz<ay ?y?Hf{2I 0*G>7s(s\l'tWU hPʞdPiQjzrqdp\vͷYȖ~saݻ|UOe ]&\&3 TE5OK(E {&/2Q+in[Ui]2F{17 :Vʰz,dzHk /&oBFaEi\L a&pbī)uY)`p-Z<bw%[n_[rWZVfk:wb6;3SM-_!a@6K$!pZCLСt.FZrj7.-*0盠Ҟ{&v̔ VVz},k9qKo}lJ2.$GZ2xddu(e әeӠ3]/z'sҦ*.ݔN *}bQ kvJ!P}*dTL~wK\|;D~V8Z^.)E Ś_=eUc= 3(F[11fxfiה c\j=ģ0KpNf}2𚃯wH)VybL;kopcTԀ$>B6Ennm }p+zSYexw)ߘ?Hnqkވ<3qPo`n–#]s༏o|_Ddr.E#w/cާ6M.@Ain*0kfK3.1ru ʿv+kO/C^iBXZ!V:BI~+LUdۋ^(&7W)a!X9h/_֗RBOG5 D&sChȒ%F,1d%+3Ta`a2O#p)n iȜ8o /kbĪOz3$rEP*%ٯ``iIAAzSRh<.}q&ܯs)J]8&"w4mV |[LrI`^A$NjcFg gCU`<׺2Y݊jխ|}Q0kO, Q!ʜ5ĦZΆ&QB܊"k Bҥ+I)/io\Qb%볐zyY\26g& ~~81< +3CF3Q#bmSO*[zpi YG226cV))cPk3}V UA6]Q۞ \I/wu ~GW-,r(B.1uH>//?U4]Y{N]!.>Vl0w!Zl (0#WP47 A:.eFJ^F o*)Fnĥ&xcєB"!PX`@uDCL% X=},`2kֹdr~ؙ ;c-B2 # "Q XC0&-A=ȉY#bVwf2{$6R# J[4kSV4ZE E}-.\XteTspK]K|'b>ʏ4z0- qu&)!x;sa҆[t`f{YFيLćX9IaҠ,hJrO|< h*(]0j5)@#yP,QƊ rT1*1}KƁ4`僺 ?ŭ(5&7(DA^ε9߀خ4_Yޱe}ǚZVd["ܱ$4*k$ CbDXDK"y9OmGhJg%ܱVTcխ[-C) u=)ƈ*I!g:]9E< i"A}6hi5L@[H_ 9"Wה/ G+"Gc]#kER^JU, 7D$H-13k D,t_'Z`Iv Xr,6ITq e$ b g@쿽?M)HOJHOAg0-+cLȃUl2,9 KDS(aMEqw|wk/? 
9άA3zՁ@,B:]Z00alSZ yE9tE.2CḚ#GT$Kf=䘓#q鼑ǘM2Hq|5m̄b4GP\Dd.Kb{XHmH:T 't bO% B[fbG#ZG={dWA}ڇ"l2xF 1vym!'101 N,#F vq1C&(L䴄y,o0 BB}:֔p`V 4YVa\R`B ~To7'ۍBJl`|0 ~?`4;T/(utf3 }.kknFE嗭=u(Ҹ%Jm*9}3:kK^IN2I/@EIHq*3 ht7>tObDNaQ};6)bxWSPE9|f/u^<#kէ2p9 ItF#T+ږOoO <v&dv|'#WOҞ43zהqY"̏n/o%\pr-n1Ws{WN F9KVx* :-uabHj4Ax󻶠:_ǨOv1Ϩ $$-f-MHLUm0aѭ~4=sVz+v&rv8?_-~F =Wv;S p43W] ct)̛)n-7p-N߮?~`+w}Z[+[|,0o3pbg˶؂Ṕ> J]oݜ|mZVTG.WϖlE֮0{!{s<ǵL :~F^dJ~b$ck<S6([$)U7BS%rXpuý pe3%otbxk~y)tAm_g.ovPq=ںϥ {+pxuw]LZ>|}b6d9dLwrxT?ep[3UI@>QkrmWehfމV~Aީ#`e,# f¸,P&gga (󦃥[^LglxM!0)~֋H%bZK|NPHTovԖS( K ^q RRǺ0'I݅^UuW80N "ItY- :7rsOUu<45gQk! IRu19Ȁq,*HΈ 0Vm*s DX)/XvqɌljPd8E[ڼSuE7qc4тnw@Kxw:W{7v=aQӍ6"\1ޅ j5zta& -$JHGuͪ,kn1xNjaN+ZJ-$o2/a_U\g49.,m lJ㍝YRLV[;I,}|J?P6[?Qy" r X 4M!=G&iX x~p)R\ YraװH\$"K]WK)|!R! (m(ǔ#>JҽjʽR8ɏWmz yȀ*Id!#€Έ;"ΣJnsNyT^e`"d0 x0u@B;?sEyZR4ܪ0>"S"^q<c,eO@zH> ``Ry`x&/ ˤ5i9Dcy[S!)ڱq9Q'3$IطfpS`KbJ uA5,JS\ÜGtw$N,2Tx?k9íaEf $1YN4rf4NR 5RkJdaEjB#8Fa%T!dZaV9ϬV, D:goe^ϖ(nZzk*NdWFFC"Ύa:YOҤp$8έj0J tY.kULF0uPG 1xl S+z!Y/P;PG+0٘!U]D w Ө[`.BTx!^\rB \ 9Lf?;́fBPEHP#W0.<= 16v'԰jIS! 
L1d3C M%͆Z98ņMsr,U.@0އ)/X ZτaX@Տ(yNN\?PJezO|aLbPxcR&z0^=KyIb-a#Z4#* Wjv#ڊ@rU/OqVl_n}^:Q"U[w8ƒR^H IHh\+9@+&$k%z1^&58]瞱 z9`7(̌wnB+u)ן2;[^9'>: V6Ӵ0Q[,d8ȷZ4M\/E JQgn=e ]$^* @Ry]P }$$g:}C_8hwq4q6~`z'5{Ǒ #̆^/L s),X*-#օӆ6,#D T|t(iA$$k#~'ʰwj?kn#=0Pv%"쬉Jy^pR˸.{m@5/(rr%ln&R#]B4xa"NgN[Rԋ׮Ab䉬-c~U+t(^4P#ck)T c9Â9h%M⡢#j8uySu3/92ӷO_a{1| 3 S8H;%RS+#_hlI2 Ql C f҇+ە=QHD)|.<& 0/C™J,$uH hhCsU)jQT(K;)*(og!b!TZ~}hrɔrDgvׇx<]}žrv&d[V{˜m#aphTɸJEt BE  䖽|L6CȪLMR>ت^}ᖺ|_v_AYu!-/5kA 뺹3Q/o&u*~C懥,7~ZO~^\_c"7R(y]6 G&ks?QisgOj-=VlQd~z>eoyWNﱉyiVop`9dsa[r$#`MBLj|L6ؖO[y%s5h!E7.{R_"3eZārٟ~-%SH#zdjlgC*pZkт*?E{H/FOrD,dPb ̸VL2T/H|_~*-*@Kpcg!$dCv9Y;։ycM$ *n2b.ʦl8_I(JE kpǫN.=@qdm MJ)@-hӏY%Wxf>} uqo`EFǹ&Z S[,ڐ"njJ-T pL.&Aoo7LٝK L+ v E]6X?K?Nf6坕\Q gwFp$x*XblVT-Q7G2 ZÔ%3RL0c9#KNY`\#G-Y "/STv6"S&r۝ {eTBnC:[kwX΀XA;+T7TL㦂J *TPi%QBۜ+XTPig nsڵ=̫lTe  @\ V?x}Nc+&R:l%M<9ЌM1lC/%ߋ K,{i5 Ǜ=n`;̥0[6HǘϘֳ;}uX3dru BÛj/cyB5AD-ys2W)b8`7Qy&j~R*rl }+m`dh+9GőՋSKnbvG# 1f^:յ"V >sGUSp:4+\t|Xz&5`~w,g:ZX]97H' lO#$F7w *<\L/ SeWpYi-H($ͨnGlq&5Xu2O3>9m#-Le* F.d o!Qvʅj[2$4sF\c$˰ jo}7 f+J<8B+jzezy VҎXqipo{ñ- _٠[mHkVOB_nNk THJu:Nsw9 IϚFmAX͐.+%툭!Hg1a\\{ۤZm96(&t@x`7W{S}7._0VlB7h֗!h#W?_c˸  *}bi)c1'Z"Ovqsq+,WnC=Sn]euR֓|*SB:GY7AjchRy:hcwyZ-ơ[zќHև|*Sflq[7%XTĨN;XsFB[zHև|*)SrJ@pќQgќO8l1yBz%i;wθ#9xW<2Ґ \ LmaRx ڭWT3^ 'N@s(ܷ=R jaIK wn<& 9~ "]$M$mHd ˆ"q ;,ϰgpg;sCP+~siy{)B;I4i&œAIGlYV;-FkĬafg:G,SNIGػq,W *|?F,0ml6_ RdJv%HSdD@T9W=_X{ Zkw|1w?>[QL=.Z<;QKMJi Ey:jNB0+ѵ{N)9%4wi dC%ȥZMoi㞙HGԴoxxSzRS)*"(ǥ; imK) e! 
l/ ($s6MQRԛƒRD,߲VImJjªbo.UP(W0-Ŵ*9{P`]YJ'_ݓ!#hPZ^*]VP[QVE|c.X"S(t)a@KVU@Cb|襾/ !*!XCvtF' bj9եA$;e<{վF0Jz]U1 8Sq[VAw[i+֢$DLI\HҸWF*;+cl}!Y[0hR0]jQ{bnpGGOn`<$闻cCfkvB~#'Ʋ$cÑ ) !xJA:o7k]]YEJ[k+i<*,**t6Tcp l25DnpH Pԗ& 2Up@K)0%ukʆqMHI02N U(Аv/G1gLO:>5X6_cʟQlCBj];̋f4D.8`@n%ג2\XVAn.LeOn 5 3eT((mAr34n!ܶ 㨢aN+Diq׵+İX7{I tw"L1@6fol],kkI8P8d9g)BLWv)3uܓ C>Tۛ eb·A0nǨ!"{A Vd >m㿠C|ng 2zދ{gƧxzzvpv 񴹼׽$a77%uΌ{|Z,x&+8(]U0{ HCƯϢ%D6#!2>aig5MdH=\QX˥EX -z*(KK dDy](,wC8TJ0Qu;uH3ZV0i V:@[gJ,,$SA BAr1f$jN3jAXb$(1}NXG }oE| p5UGT B\'1pQLsn:u!߸SDvܥ/,sqK_74tÁ5ؗz6!EcKI1EVa$6b p7qya:<=rTp'n둑TH@~axi$PL)0Hh&PqBҠZh;f))iҜ`5,w1NQ9v·B^?w>S;% A(}/M 11U8N%ڦWYv!߸&>=y 2{D$l8$A$Cq=ߧ[v5EpǺ$jϚm..[9lD1F4xDadpz PX7 :j]Ù@@ë(@ݛ1L~Vxh"ةB- n= 9XYwI?YqPUYqCsoJ`wS֏?_y{o?.K5~[fu}O9|:A0o->7y]bFlYϺ3 S);/fe"zό Ae3Nm谫7cod5a ᦦ2M x8y*oV΁';r6"JnH|/-hhKdɶWWެr[ G(KJ ѣG3bv7[%qk=z C1BMOnI`0Ęԉ묞/~)93 PjO2!ny:3U4L,o[kW}`t?xMM_z]y){+a(| oK'αz`jEX%:d!s[!N0Յf,+u)R ZZ[>ܮeR 87-SrbbQ:)q]hw&˴?-󫂖<У7W-#oWU䉮\dfpʊAVˢQX*pNڙ퐯]8y쑀f..Ht z]vJ  ea ߶ "DHݶ⤭"8:]~qVSG^G6H8F`}ƺԹ5^+9fJ`kIH*`i ˶.m8-/a&AJ&30-Qg4S5O~ LO.Иŧ}qr#>Oh#r $!fi1zbLXS癭$԰~oVxV3Br!?FѦ\t*tc?iǘ31Stx0֔}vuN9zˍP9W:'M Ջ+X|bq_ tw>&ͥvjJٙsd ~zQz~ lҮz^/fu\& 9!9߿yͶ7~}7T"wDŽ^̊?ms}͍l\'>UoES|צ2AatK uR3 "Bj5'޻ץ\.g>`pH ( Eɣ%q! 1ϩ;g|[LiX?,s6l"(mdhXs#bP=s+cˇUL%X[?+ =Id/ɃIK`mb' [ϭcqg$N)%&Ǻ\ϊ̯,^ߚ/$Oy{?9,wG3h)ϐ?uA=%)3 BRPB( <<'a ꋀ?~;}b:^mfRܳqXk@R^!d??%kҟe+piMzy{gzukK[F7eoySo $TSPI^K]j XV Z Ҕ#CDSbW?y-|bY(]6Grg7CfDV~cV%|圵uhRw믖ކ $tsHQq ,Jn5enf֊ MwZOnk"唃Xi- 28#fu !]{%MZ BqFmy lsVkЋtZB@}EgUqѢ u!dg2 EPF!3nMmږB]9Zظ, s<@\p>!@E `pd.e)bTH2!IvI$跩t'24%@N.!~Lَ\8)Ȇ [t4%-A2S"H%r?N!] 
ON $Om( nɈ;faZ]›Ȯbx2S~(g)noJ;{g\˔oB3ۄa&]Rwݴ.@λna&Da@m}}f2^Zm  8ttˌc8,A %U.{L])@4$x;Wھ^1(t=L q`:_PMۊC11^HOaŁk Dzo9:"G~~V(+9k5ҥUp8Ryi,T%HVZUi&>P329Ÿ-9Ÿ-@C K=9y \CwgL/2Z12R)4G{Dl hӸKwe/YY6 6f{CӀ~$ڵq}SU+'eaQ9T)%N "$g*22' xJLJĦ$INIQjG ^Jj,т(J\Bqʈ: q\M9)Ew.w4wS7˚̀mgqMFOUU'۳:0GI[0d Eǹ[ߐyzy~{7 0On1}%~pf4 ޒv[_u~{Sn Oa4NxG2~T~޼Ql{=zzE@=B[)"J)Q0Q 516Em}y4[e MZq&)}hI'^=?xi@MɺmhZ)Όr|IUq0;{P0;MSKk07ӉT;cR# I9'V1uj6Q_ /]v%{ԢhM}<a.L4[5‹׻J]){S=xp]Xqȃ̑oc[P8-Atߙx权(43 =mo];r%/ͨnjf^X ;1'G4{;eKngdon/5Yt|EzBΞ_gd̤xJWUQЖ;[kt&N^lp`T|Vj4LO562T\XpP0)$c&r$ {WAK(PQFucdC=(C>GT90֭ }{v%v.unt88{ &48O>X;j$s$bIAcMtJrK4ک2N` hJN3n8%t<\ԺrD YdcV $؟Ĕʼn 8V`caYi# K[fۇ<=|j|<0U8{ox<e$Ç/RADBt`@_\Ww2ކ|邈COށ: vŋn8!tW-4]w D=-@1#vo X))* !P”A3*)Pk;xOA,RȫfmiUY!rC5}SX;,g0ʩV'C }>46A\Jpô%?z ֧1)%I7оiQPvx Aj%vr K 42L&#;Zm?pb j!fļ`*LFJKCpcǛۯܒ4؉TI'##1(8 OpMJEi+d<|k N )]u`y.i`q9&䨚FɀT<0\ %}%ߝ'ߝ'ߝ']|);Z3;"V ̩H'`*QRN"F;cpij4}Q);M_^5N:՗h`ZoJrhFc(3T7~jg>iB2ypпxM rKI)9~nzù}Z-U}H_>7\=}3Y9C~MR XKoܴ .qXw"GG+9ߪ-C`o^ۛ '.߯c@*1>cT:ZޜTI))9Gmi# _#;"CF'w4WH;7F>ɱVVp PQ ps<է_k,C0ARэ0}gE*0+\Q*g@PN$S-e`Ktﬣ{,0D3;ffNRs0hjVLkVKHg(p ZX0PTpQhp2{ HmO83s5bC# PS!010ə y?M a.0Y7՚-X?f +ST]$`cLp 9qwi=m_"8='d,uVjՃ\ VSe}h|I^nD:N-,ޞwh}gגǚNloR:q޿<7?e[^_ᗌ.YAw 'c7>0'b%^V.ZyYb7e sZ?9)97I{Zx43Nȥ[<́S /{4~̻Ą:jB ٮXZԏ~wSCLoa|gw#g0zC7P {pZ5HniB { ŻKi9b'} Aj. Y8aU`]_\2BeS:5y>l ٷtӈh.Αک^(*~t<1' ntN'kQvC DVڛ@=:nv[{O`.lxzиT10=})҄r&+d{>hV3U9+րBo:?d85};7VKvP`W,@K(n% @0޲>^5ToN`ASELp)& ',v*f60+gv"Dt}Q=i s_hS6`By!]+0#0e j\ { pn ֵ6˪`LcݬXbV3P+LZL#~6o> ۶j \9 O|fbq2NQLDnjGLJ&"2Q1;U@ y܅#~Ąi2 xWA,hwIJ#^흣?P>n݅թ$:5.%Zs/2$&ދBDZ09yI 9twY@rޕ܍C H;x܇CK%36NR2xj8''H*mLV;'%9´2 v557wԏO8oEM_ۻKPgK? ?SnQ%cؔ\׾Tj< * bǁ` m̑awC"\(u yW0L"]0jӃ3Ks$ь.`D! 
ю7 UjmPS+ />}ITqwۃikygb.uRBzI߫:6:K=ǡ\ËSD-`= Y"]_?w¯V,բ=hի*x_x_{5jQ *AYy]+pIJE}ނ"[9-J JHh%\Ui;Q!a~O.̓".<$َo'n'*@h;^ ӯ-TDyݘfXw).t67`}5x!pp098BКhՇH@ \S!㥚)` h hmpȟE'7$Mr q4<;*w1-_h&a+=맜r GK xOI]'uNٓUFrKpXL1cEJ'2>+x>mZ e붒^FbP".rdb{g?"$ "҈ v ] h )\Ci6r.x e;#ANkOVat;ɿ>(qkNt<$h@ Df|iO ng'5` p8ClӃi1oQx+AC\TGg!SPM6kc=ɗBLqD(HSr)((F7D"Z28% 6{&P<pD R>םw3KNI ^8uτbOώedZ+ ԛ+ӧOV bRdw$e"p%I-`)FVJ%%+ Op5@"ą?Ej4Gu<5o: x f˧Wa^غJ M btkTGoxh:Y&D<nfžC6yGTztmGx9Z+]_^7mgqŻ FXd ,1-O)|) K ]"ĜVPR3 ;OF*GRdؔZw4q7qKAHJDK"R q 62@k-\pf1;,|, ^2=Ҁ ǝck%RgC!S#bSj$O=<+Ĥ.RTyVZI-dhV}1^Y ;zxզSg1 ~ @Xu[n[T}GҶԾ"x) WZY5Z꼕`I|ƚJ'DzfÛ6zYܧN(JOm˫!(L2GTQ9m6~rX|94K˽[LpZG@}eap4DG.L6L2 qRiQ2(-"(c`<Q6B:AA\hG'2ms/퇿!v}oMg0x?+QD*F=߿=^]ǫ(#d@y1++x}tDHQ2H~sm_ .>|J>SDB">y$sM{>8>R JSIԦ,R=hBUڷTP+ReayE<T^7_YIQL(7\$NQD *jTSOYIep] :Tń)rZ)=(yUJjAuz=(M4A.Nd4FvQXdgC0䈸e~]&wY~uu, J/ cbL.¡!) AN Id6z^o KbR$En6L9jRs G@gScMt8T%xU*^]LM!{X],n br`mo][xL<3kYȌD ]߼ԛч٭[$r߼^}r]n%{[qE`.'H8is¡C9AQD5w㵛4|z4feY LC*UiHЭqS ^P4'e86̍J*v< ֝z\kGSSlk4V)Lpvro(o"لB' /s;qp?9G'+#:D'K5v2!^C\)Aki)7f s)J)*p;+"[ /L?DA3_efm.yԆjUKfoCo{( z Θx vng`f$ kN_>yeeee] *e$~:M֎(ZԨs]i$²b/PIۏo4\6vwI [wqq$Gog +cDJeSߖ1&{'M CU@:T>i 0n'mgW (\f/b5<|Δ j* R yrj'( "\dTю`]Y>*Mù%]hh4 aj`jG ( 4֎2O`]m{ טOL&닏ދr#52:oRz22uTcG5Ϝ<3(<+sRlfXiznNe7kCy~i/6ENENh>bfS@Ԛ[7x0 dL2M兾3O ~IǶcQ83ww^rUHi^Ι:9n^TOs(c<` ,Hu`)Jgހđ1]?KX"7 RJGEE2*%Dx Za5u Bh{bjS[dN51)zzk9(s#֓[Of2ejhV'L"rcR<*RpoOSJ)0v=GBwb:G\d00t6 aYSq}Dz~lQs'2AdsQ#Bwt?/noy\j޻g1{f{&5gYI ͼ>=7G :Y4FΞÊyDq_ 5GjpG1MuЬ.C^mmc46LS+SV&B-\]0,IhpE@e[rd+ͼF7=7'*-;d˼X"3 IDea~p 9`)m #e=ݲJIK\/n>՟[ W +7]vҲڥJlj%ҺOK(ކ+ еe=` lVI{BzƔ,vb%C,F.@txKaS4!$Dkd)iVrEC5>4*i ^Q4\7ZS8wҀNaX/ V[XH DN'gBĀVwq>_Cx{{azULV mQ_&U<8ȵ~מ5j_ELSmХ2z!AVO5/_z>*Ƿ G$>TZ0㰩RE"tfg}lݐz7C'㍐ gX]ZsV0x-3E6.eNH]'_ i!F j|z̓JkI&˙&MΕ!mwߚ2FVT`(!G;5КBr^;p[ɷ55>\B@>:N<2;Lm5ueu |~ 8rbyw \%\p.~5XosgBZ _Nzw":6k:RSƷR)rAe#P\NFiYxhjY;e!a=%/OP xT)+cgl#=-+SaQ+ wC9m@l`Ʋ Sq+,=e"r_^r^\\??}w1sث["gz>;8yER7;L0+""vbZl>8sZb)zb$qeZ΍Jպ(G9+|ƚJ$ v3-1;)}>`?s5T9 hSup΍}joՌ0GWHǶy7\yze!>^3i;xhP#G>&XF98׾s ~Ŧ9-PU=jK8#=mEw DJ3r$>CS 
_-!T&b\)u@礽_cũ{'Q38QsT"5k=A##2 ZP2.g^@5;ysܟNԝgGy:ʟwrJbGT(AP]@C+QRdIti֬T@!qvM%UQq֮a8Wwϙ?64k pD2 "&NRJa5pBJTBw^fTqeN_F1_"c'/}bD]p3S&kjcǙipZ#Sǁq tUiWJ-$'Ye X whY syg^/UBywsu+Zf{f>"Ǐ+Uө:I:fB,lX܇2j~2M4.dOM,cQ.Ʋ?{q g=+9d ?D_wHTS4Pr8ԐVLpUu]8Kh'c(Y`-Cl ZŸ̓08\^q,Am+Aہnz5 +R9qfi:s0jzJ!il=ʢ+0lݜ4ҸtN|_|x#7Z>?iu'f󷺶J_?x?t~1^?ÿ'Nc$5#H&lwX }pun]9vSGW2ŅByA;\*XIʜU$R$!SkiUu&H"dNB>"D"Óc0R'="MPS15s$y!15u2,(%GaRD锋VyK([ɘYWX7bbMܫ&-@vZɉw Fz{ @tO0`ـY$*gFY4GyBR0F Dˁ9- "Eev.A# 6BXX\!eB>K&PT4!\y_~a4EA8\HUQ8q8tYF?AƋ9]g% "#g'Uf>ݗgG[يX zCߴVYurZ5kE6X KyUY?,D'J+4vGZ+&{F8BE弇*7& ƞr߯/.M-n9 TbxQc%ɩ5ōVm`[,%Y;4&/%zmiWΐlā䃱f[67bR4m:qh'<d_w3G-<`mZŧ+{Lp3>8' (NĜrRYV朂kS:!G f-!kPoO9@u0:INc5E:fEI:9]%bobNqdhHʒH^) *Se:J:Fw$edYDC6L $pu ږ]"E JM'fJ< ]d<BFԙshJ-z4eOfeHHNY k!2 4/h^[\?W_Fea~M/]Tm!3gAv^XOa?u$fpk ?9_]2ͮpzh/GC\U _ ~pO/'迿:;;<UV~gdtg.nxL'g:kϴ~5_oVyieWyz9hU!W g_o??7_oāclC6c <c 9Srg$p#wy/B#Pmj6L}_m2I~"QoVADzrtVhI%{-L$kSPk"`2ڝwrW f*&i/alg@v{)v !z0GY@K_3>"SIzI&NXOF7*Y Ki7prMuqבV[HsVA-o..8wWEcYO>U?rɾ$o`xך^.l"2a_$|Bo"0ھbZ2!,PmOis 蝹(F @ޛ]..K/-^7W5bC!>y19-;kϭ3\"gey-~ogZ9+kٱ}ʜ1x`w6㙰f΂}A^[ʉڦk;W4@=+xH`KzܚZ{._Bw:Yjr^N{r2αQ-$ *k{n/&v z>@,,]+¶[\; W-n/z?懓`rquѐLČ=俚#n>4 ck;=Ŀ~V7{R>0M)L }Y4L.Rd0*aT&è>n.O/*aS AC:cB&GX/Y-K-@YubS>fxK|:% 撃>Lj2@yWr F#t6`)zE+]5Z;wFG`TQeN܁ӌ٨ Z] 6Db8y+ d@Kz鿦vVvC}Rmѣr0zTFQ`t]>ޣ> xf2TYg1B #RV4'X>J>je֙jeTp O&LӟߣM_an#GëΦ~%J0̊͒Z瀞q&tn;Qv^ 1:Ca&uښVckB6A1m57vJQL1XcH)K$h*=R U*%]̨"i'IE->bR![XIK KLRtDB 0Qȅ!8C$ 1kɷbr` yPɤ;H2T+5,: Kq!/X$m#B;$CLX@CCR"=c#)RRL2MQNZ!fD@BF%K$n,LP\{mMHPFe[5k@C Qlb$QV9k9nQ? 5{a-",y0⬖ItcGgn א݃xuQ 3>|~#Q(i'x~}Xa2{Ļ49ѐWn= û!E _Q}4gϜS\ϫ/>A |`Qը}ER0KtI>7ǧj!WwplP"Es{{t'&;xh ! 
CB`/^`B!Jfo!n$- bDC-hɊ7^VشUMW$;śK/5SLoN:`>n @n6 7K.oAjr!I֛֬Kk%5J$lub)4!~YEo#nEA fRlQ Kkq<9w+[I (}~-Cb=KꤗP׷–sQ/ՕRI|W^L/W[N/ݗA8'VwձL6Z+[v6?м,D( r:ʅ*c"Eβ}tV e7UZ["mO *o^ w+_^?82k:,Mtş1k +r3&k;yڜt*J`)AۣIs    %~0 pms(sF 6:BIIk69d:vX#縑܄6їS+@!:Y :2kی\z\6SHc1~a|EIXiw"oR{fM=|$ۀFƃ`ɡ&,b$KiO̠L%; 黋  rklA8^m[hZM=JQnq[gQضSw bK6^e[b9y5D V3f&ma/h@ lGȪĪޝ.|w@b 2PLu^%"ZG}hp5hoAVl oeW'yLwbszn仙*FBW "'Xx<5/Knkq^W#<-U̐/n8v4O]Γ.i%J6Ђ1Q,$%E[čt߫Yɽ^Z+$ w{\ AbOR^(t koǟq|$"|4ҷdiU)cqRl P?RM ZyY*&s>YQ0"FiFe?%q0 q)`bm[(Y+dŔ0D^XWzJLRa-* ivepT˕URw(>~&&žդl6#zH('wnFZk5[z. cfVF#Yq&2gj{9_!Kp@Z{wllqNW-ﴒBrW=Ù!,$z [C} e#DK<QdIqnuJ5ЅbQiHYꒈYb%b:(VEp[,$F% [ÔrDbDV4 o]}A tF݆uuC֒vJ2?'< 8LZP૦{QqWo:}_O+M!fL%'sNpݧ|T+id*ak0gY2Xc]@V Q.#@=g A~ =0v/wC&ݧy.!%Fﲇc<:[??7-F$ho?ᓇs1hr'+;:y(vn[po[3Vi&^/t=h z1<3AnNlkx9>?j>V-ÂTQFw{E: {l>f:]yFqۘR &j,V޹8T+;*0 _O:{enj|{SN9i,8uQ]䬌L"=#Yh(5Yӣ>tЊ^n]-:tv~FW2m_gE|p&O3j2StIj7O^*NɁ!*c[%8z"W@^ }`L^GBz%he1vI!:4.pQ 8'^@$!֣aLZMA$C:P-= \U)%mr=˛BXBKDPl Dѩ6wŸd*ϲC $`$qjyd$6dv; E}Ұ5[t[H{7'oƨ-zRKA;7C CB)O -M96`WӈT nȽޭ Q'uVX+咠 ܽuڂp #bdneynwsn#S] ^z!*A >"+ RD0I($1Z9sE Lܙ)D?9/j d?AUNau{ͳ^m ]A n[X1umՃ'@s8Z1ŅvPͪCEoTj9o8yQ7!3Uf5_m'Gu/xLіX Y 2zՆ6-X{ƵfSbXGRŰZ0*sT+]UhqKAUZZqv^zɽ)+FjnMg' "$Q2 `9b#3;㘮pW k%δuj]bV'EaɢxkDTDYDpB Ёj(YN!k5ǽdޗaf} \d x \0®c c!<" +5M+î!RE+# + [S3݆wpp%hޓ@R q3' %Up0]+[L׵}x*̙N]q0e\~sO7ԥy'A]2'2=_=d _#K[@|wb7?>+)i NV"@ *ʃ&68nEUsf#F|x;Ӝ+.)y}C -F[94HncKa獿JX 7'p@~cUp Ō ًU Tpp \^5E#_)/ި.|&T+bË́P=}_[lXuE.#ڐ}O pp2fHasJނ,T=y"k5?DVJ\0HJCrTh"%:BY!%"Hp%AI`.1e8" -`}- Rn &'TPFL[S]F+`}7tk@K$|Ni Irq{A! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 18:03:11 crc kubenswrapper[4830]: body: Mar 18 18:03:11 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:58.301928268 +0000 UTC m=+12.869558640,LastTimestamp:2026-03-18 18:02:58.301928268 +0000 UTC m=+12.869558640,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:11 crc kubenswrapper[4830]: > Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.545002 4830
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0186020a0254 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:58.30200994 +0000 UTC m=+12.869640312,LastTimestamp:2026-03-18 18:02:58.30200994 +0000 UTC m=+12.869640312,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.551755 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 18:03:11 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-apiserver-crc.189e018699fa1190 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 18 18:03:11 crc kubenswrapper[4830]: body: 
Mar 18 18:03:11 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:00.851102096 +0000 UTC m=+15.418732438,LastTimestamp:2026-03-18 18:03:00.851102096 +0000 UTC m=+15.418732438,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:11 crc kubenswrapper[4830]: > Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.558685 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e018699fabd71 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:00.851146097 +0000 UTC m=+15.418776439,LastTimestamp:2026-03-18 18:03:00.851146097 +0000 UTC m=+15.418776439,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.565178 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 18:03:11 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-apiserver-crc.189e01869ed8fc6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 18:03:11 crc kubenswrapper[4830]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 18:03:11 crc kubenswrapper[4830]: Mar 18 18:03:11 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:00.932820077 +0000 UTC m=+15.500450409,LastTimestamp:2026-03-18 18:03:00.932820077 +0000 UTC m=+15.500450409,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:11 crc kubenswrapper[4830]: > Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.571010 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e01869ed96a14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:00.932848148 +0000 UTC m=+15.500478480,LastTimestamp:2026-03-18 18:03:00.932848148 +0000 UTC m=+15.500478480,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.577821 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:03:11 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189e01885615c1be openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 18:03:11 crc kubenswrapper[4830]: body: Mar 18 18:03:11 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:08.302000574 +0000 UTC m=+22.869630916,LastTimestamp:2026-03-18 18:03:08.302000574 +0000 UTC m=+22.869630916,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:11 crc kubenswrapper[4830]: > Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.583232 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e018856168d62 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:08.302052706 +0000 UTC m=+22.869683048,LastTimestamp:2026-03-18 18:03:08.302052706 +0000 UTC m=+22.869683048,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:11 crc kubenswrapper[4830]: W0318 18:03:11.873870 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:11 crc kubenswrapper[4830]: E0318 18:03:11.873969 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:12 crc kubenswrapper[4830]: W0318 18:03:12.024182 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 18:03:12 crc kubenswrapper[4830]: E0318 18:03:12.024252 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:12 crc kubenswrapper[4830]: I0318 18:03:12.134233 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:13 crc kubenswrapper[4830]: I0318 18:03:13.134286 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 18:03:14.134519 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:14 crc kubenswrapper[4830]: E0318 18:03:14.335989 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 18:03:14.337908 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 18:03:14.339557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 18:03:14.339645 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 
18:03:14.339673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:14 crc kubenswrapper[4830]: I0318 18:03:14.339709 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:14 crc kubenswrapper[4830]: E0318 18:03:14.344622 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:03:15 crc kubenswrapper[4830]: W0318 18:03:15.115942 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 18:03:15 crc kubenswrapper[4830]: E0318 18:03:15.116024 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:15 crc kubenswrapper[4830]: I0318 18:03:15.132630 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:16 crc kubenswrapper[4830]: I0318 18:03:16.133175 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:16 crc kubenswrapper[4830]: E0318 18:03:16.317354 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:03:17 crc kubenswrapper[4830]: 
I0318 18:03:17.128754 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:17 crc kubenswrapper[4830]: W0318 18:03:17.799752 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 18:03:17 crc kubenswrapper[4830]: E0318 18:03:17.799876 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.134486 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.301909 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.302042 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.302129 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.302325 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.303994 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.304043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.304060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.304694 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"719a4f2bc2570518f1eba72413d942eadfbfbba05d54865dd384cb3772a7705c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 18:03:18 crc kubenswrapper[4830]: I0318 18:03:18.304983 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://719a4f2bc2570518f1eba72413d942eadfbfbba05d54865dd384cb3772a7705c" gracePeriod=30 Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.306731 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e01885615c1be\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:03:18 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189e01885615c1be openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 18:03:18 crc kubenswrapper[4830]: body: Mar 18 18:03:18 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:08.302000574 +0000 UTC m=+22.869630916,LastTimestamp:2026-03-18 18:03:18.302002971 +0000 UTC m=+32.869633343,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:18 crc kubenswrapper[4830]: > Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.310395 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e018856168d62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e018856168d62 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe 
failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:08.302052706 +0000 UTC m=+22.869683048,LastTimestamp:2026-03-18 18:03:18.302090923 +0000 UTC m=+32.869721285,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.314190 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e018aaa4ebc73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:03:18.304955507 +0000 UTC m=+32.872585879,LastTimestamp:2026-03-18 18:03:18.304955507 +0000 UTC m=+32.872585879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.434418 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0183759619b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0183759619b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:47.35566892 +0000 UTC m=+1.923299252,LastTimestamp:2026-03-18 18:03:18.426617141 +0000 UTC m=+32.994247503,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.676580 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0183866a037c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0183866a037c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:47.637992316 +0000 UTC m=+2.205622658,LastTimestamp:2026-03-18 18:03:18.667726442 +0000 UTC m=+33.235356784,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:18 crc kubenswrapper[4830]: E0318 18:03:18.686373 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e018387183a83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e018387183a83 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:47.649409667 +0000 UTC m=+2.217039999,LastTimestamp:2026-03-18 18:03:18.678614641 +0000 UTC m=+33.246245013,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.133586 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.401347 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.402306 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="719a4f2bc2570518f1eba72413d942eadfbfbba05d54865dd384cb3772a7705c" exitCode=255 Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.402365 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"719a4f2bc2570518f1eba72413d942eadfbfbba05d54865dd384cb3772a7705c"} Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.402455 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d7648c5bb2f9b2e80aa0140e4a8bf7ea30e0c2c5550f6c77f30a2de20d668d3"} Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.402667 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.403855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.403896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:19 crc kubenswrapper[4830]: I0318 18:03:19.403910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:20 crc kubenswrapper[4830]: I0318 18:03:20.129574 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.134707 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope
Mar 18 18:03:21 crc kubenswrapper[4830]: E0318 18:03:21.341875 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.344920 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.345955 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.345984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.345993 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:21 crc kubenswrapper[4830]: I0318 18:03:21.346013 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:21 crc kubenswrapper[4830]: E0318 18:03:21.349870 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 18:03:22 crc kubenswrapper[4830]: I0318 18:03:22.134576 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:23 crc kubenswrapper[4830]: I0318 18:03:23.131233 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.133436 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.722622 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.722876 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.724267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.724325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:24 crc kubenswrapper[4830]: I0318 18:03:24.724345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.133715 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.233986 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.235193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.235236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.235244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.235765 4830 scope.go:117] "RemoveContainer" containerID="6c2ab63e413019d112270332e09f382ddbe6142a3d2473bf644cb05a02ecb90c"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.301701 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.330572 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.418488 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.419856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.419904 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:25 crc kubenswrapper[4830]: I0318 18:03:25.419918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.137553 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:26 crc kubenswrapper[4830]: E0318 18:03:26.317877 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.422577 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.425423 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"}
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.425530 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.425683 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.427244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.427308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.427332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.427712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.427947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:26 crc kubenswrapper[4830]: I0318 18:03:26.428920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.132894 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.429664 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.430172 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.431618 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08" exitCode=255
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.431681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"}
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.431739 4830 scope.go:117] "RemoveContainer" containerID="6c2ab63e413019d112270332e09f382ddbe6142a3d2473bf644cb05a02ecb90c"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.431885 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.432678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.432708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.432718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:27 crc kubenswrapper[4830]: I0318 18:03:27.433231 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"
Mar 18 18:03:27 crc kubenswrapper[4830]: E0318 18:03:27.433415 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.132860 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:28 crc kubenswrapper[4830]: E0318 18:03:28.347832 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.350882 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.352359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.352536 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.352557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.352595 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:28 crc kubenswrapper[4830]: E0318 18:03:28.358892 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 18:03:28 crc kubenswrapper[4830]: I0318 18:03:28.437594 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 18:03:29 crc kubenswrapper[4830]: I0318 18:03:29.132609 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:29 crc kubenswrapper[4830]: W0318 18:03:29.429717 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 18 18:03:29 crc kubenswrapper[4830]: E0318 18:03:29.429836 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.132825 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.850848 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.851051 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.852304 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.852347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.852360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:30 crc kubenswrapper[4830]: I0318 18:03:30.852991 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"
Mar 18 18:03:30 crc kubenswrapper[4830]: E0318 18:03:30.853200 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 18:03:31 crc kubenswrapper[4830]: I0318 18:03:31.131967 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.131976 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:32 crc kubenswrapper[4830]: W0318 18:03:32.277003 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 18 18:03:32 crc kubenswrapper[4830]: E0318 18:03:32.277087 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.373950 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.374253 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.376365 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.376416 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.376429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:32 crc kubenswrapper[4830]: I0318 18:03:32.377139 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"
Mar 18 18:03:32 crc kubenswrapper[4830]: E0318 18:03:32.377352 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 18:03:33 crc kubenswrapper[4830]: I0318 18:03:33.135022 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.133974 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.726505 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.727684 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.729388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.729444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:34 crc kubenswrapper[4830]: I0318 18:03:34.729474 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.134279 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:35 crc kubenswrapper[4830]: E0318 18:03:35.353972 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.359538 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.361325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.361387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.361410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:35 crc kubenswrapper[4830]: I0318 18:03:35.361450 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:35 crc kubenswrapper[4830]: E0318 18:03:35.367333 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 18:03:36 crc kubenswrapper[4830]: I0318 18:03:36.135503 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:36 crc kubenswrapper[4830]: E0318 18:03:36.318747 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 18:03:37 crc kubenswrapper[4830]: I0318 18:03:37.131856 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:37 crc kubenswrapper[4830]: W0318 18:03:37.238920 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:37 crc kubenswrapper[4830]: E0318 18:03:37.238986 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 18:03:38 crc kubenswrapper[4830]: I0318 18:03:38.135986 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:39 crc kubenswrapper[4830]: I0318 18:03:39.133550 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:40 crc kubenswrapper[4830]: I0318 18:03:40.134471 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:41 crc kubenswrapper[4830]: I0318 18:03:41.134493 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.122922 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.123159 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.124798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.124853 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.124871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.134022 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:42 crc kubenswrapper[4830]: E0318 18:03:42.360624 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.367656 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.369806 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.369864 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.369883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:42 crc kubenswrapper[4830]: I0318 18:03:42.369927 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:42 crc kubenswrapper[4830]: E0318 18:03:42.374080 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 18:03:42 crc kubenswrapper[4830]: W0318 18:03:42.698119 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 18 18:03:42 crc kubenswrapper[4830]: E0318 18:03:42.698216 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 18:03:43 crc kubenswrapper[4830]: I0318 18:03:43.135863 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:44 crc kubenswrapper[4830]: I0318 18:03:44.135464 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:45 crc kubenswrapper[4830]: I0318 18:03:45.135764 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.134453 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.234135 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.236462 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.236528 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.236547 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:46 crc kubenswrapper[4830]: I0318 18:03:46.238191 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"
Mar 18 18:03:46 crc kubenswrapper[4830]: E0318 18:03:46.238495 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 18:03:46 crc kubenswrapper[4830]: E0318 18:03:46.319633 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 18:03:47 crc kubenswrapper[4830]: I0318 18:03:47.135148 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:48 crc kubenswrapper[4830]: I0318 18:03:48.134136 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.135236 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:49 crc kubenswrapper[4830]: E0318 18:03:49.369632 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.374553 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.376828 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.376912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.376934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:49 crc kubenswrapper[4830]: I0318 18:03:49.377012 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:49 crc kubenswrapper[4830]: E0318 18:03:49.382903 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 18:03:50 crc kubenswrapper[4830]: I0318 18:03:50.136129 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:51 crc kubenswrapper[4830]: I0318 18:03:51.133145 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:52 crc kubenswrapper[4830]: I0318 18:03:52.132737 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:53 crc kubenswrapper[4830]: I0318 18:03:53.132898 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 18:03:53 crc kubenswrapper[4830]: I0318 18:03:53.897438 4830 csr.go:261] certificate signing request csr-cc8x4 is approved, waiting to be issued
Mar 18 18:03:53 crc kubenswrapper[4830]: I0318 18:03:53.907115 4830 csr.go:257] certificate signing request csr-cc8x4 is issued
Mar 18 18:03:53 crc kubenswrapper[4830]: I0318 18:03:53.940189 4830 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 18 18:03:54 crc kubenswrapper[4830]: I0318 18:03:54.904049 4830 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 18:03:54 crc kubenswrapper[4830]: I0318 18:03:54.909331 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 23:24:53.552783059 +0000 UTC
Mar 18 18:03:54 crc kubenswrapper[4830]: I0318 18:03:54.909408 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6605h20m58.643379232s for next certificate rotation
Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.320319 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.383928 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.385725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.385819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.385838 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.385975 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.395467 4830 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.395905 4830 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.395949 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.400278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.400327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.400340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.400358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.400370 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.422393 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.432516 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.432558 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.432571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.432588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.432621 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.447664 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.461642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.461725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.461753 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.461815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.461835 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.480997 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.490375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.490410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.490420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.490436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[4830]: I0318 18:03:56.490446 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.505606 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.505798 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.505843 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.606712 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.707089 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.808251 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:56 crc kubenswrapper[4830]: E0318 18:03:56.908688 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.009142 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.110165 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.210914 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.311085 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.411264 4830 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.512100 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.612830 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.713729 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.814868 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:57 crc kubenswrapper[4830]: E0318 18:03:57.915863 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:58 crc kubenswrapper[4830]: E0318 18:03:58.015990 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:58 crc kubenswrapper[4830]: E0318 18:03:58.116245 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:58 crc kubenswrapper[4830]: E0318 18:03:58.216785 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:58 crc kubenswrapper[4830]: E0318 18:03:58.317281 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.397683 4830 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.420014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc 
kubenswrapper[4830]: I0318 18:03:58.420582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.420674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.420756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.420860 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.523725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.523838 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.523874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.523902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.523922 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.627063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.627127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.627138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.627159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.627173 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.730946 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.731000 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.731017 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.731044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.731062 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.833909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.833947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.833958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.833973 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.833986 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.937135 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.937183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.937195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.937215 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[4830]: I0318 18:03:58.937235 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.039595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.039658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.039676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.039700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.039717 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.142028 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.142081 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.142093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.142117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.142131 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.150814 4830 apiserver.go:52] "Watching apiserver" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.160815 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.162450 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fvhfm","openshift-machine-config-operator/machine-config-daemon-plzpb","openshift-multus/network-metrics-daemon-wx6kd","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-vjt8t","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt","openshift-image-registry/node-ca-5tfzr","openshift-multus/multus-additional-cni-plugins-c5rtg","openshift-multus/multus-zpw8m","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163007 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163181 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163245 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.163359 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163401 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163442 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.163446 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163561 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.163627 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163854 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.163916 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.164002 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.164763 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.164816 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.165539 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.165634 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.165697 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.165749 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.165761 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.167698 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.170923 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.171320 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172068 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172361 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172476 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172536 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 18:03:59 crc 
kubenswrapper[4830]: I0318 18:03:59.172575 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172658 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.172735 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.173003 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.173524 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.173728 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.173832 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.173730 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174034 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174056 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174121 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174217 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174412 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174427 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174497 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174538 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174647 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174818 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174838 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174932 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174951 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174960 4830 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174965 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.174996 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.175142 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.175143 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.175158 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.175239 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.189754 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.208945 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.222490 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.235838 4830 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.236100 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.243837 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.243918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.243931 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.243950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.243963 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.246235 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.255428 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.263496 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273488 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273728 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273799 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273831 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273856 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273881 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273908 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273957 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.273982 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274005 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274083 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274121 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274152 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274189 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274228 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274265 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274302 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274339 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274373 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274443 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274480 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274519 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274612 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274761 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.274851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276026 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276131 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276158 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276193 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276212 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276230 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276384 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276418 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276457 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276496 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276515 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276533 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276553 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276572 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276591 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276612 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276664 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276688 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276712 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276733 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276753 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276827 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277128 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277170 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277196 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277223 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277279 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277303 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277326 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277355 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277381 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277433 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277676 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281392 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281538 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281612 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281673 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281704 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281733 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:59 crc 
kubenswrapper[4830]: I0318 18:03:59.281762 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281808 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281839 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281893 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281918 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281944 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281972 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281996 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282022 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282046 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 
18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282103 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282136 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282166 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282194 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282250 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282279 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282307 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282336 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282365 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282394 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:59 
crc kubenswrapper[4830]: I0318 18:03:59.282426 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282456 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282514 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286422 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286494 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286603 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 
18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286667 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286705 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286736 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286766 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286878 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286914 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286943 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287019 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:59 crc 
kubenswrapper[4830]: I0318 18:03:59.287089 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287126 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287197 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287224 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287261 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287293 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287322 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287354 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287396 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287468 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287502 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287540 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287571 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287606 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287657 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287690 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287802 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287867 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287900 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287928 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287963 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287998 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288039 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288074 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288108 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288329 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288425 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288509 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288537 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288658 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288755 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288808 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288874 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.288942 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275080 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275144 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275160 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290671 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275340 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.275947 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276066 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290715 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277807 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277644 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277881 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.277888 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278115 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278221 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278344 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278391 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.278868 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279035 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279066 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279718 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279790 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.279865 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280015 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280060 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291469 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301328 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301313 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280623 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301454 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281082 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.281781 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301497 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282450 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.282477 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.284899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285076 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285422 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285662 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302197 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285694 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285818 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285870 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285991 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285907 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.286568 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287503 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.287577 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.289816 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.289797 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.289999 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290175 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290205 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290228 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290577 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302991 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.276132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290963 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.290986 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291079 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291638 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.291848 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.791712498 +0000 UTC m=+74.359342910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303250 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303265 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291945 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303334 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291957 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.292484 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.293161 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.293460 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.294166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.295429 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.295710 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296138 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296182 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296601 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296986 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.297153 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.296902 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.297219 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.297759 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.297919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.297940 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.298132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303374 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.298300 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.298342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.298919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299281 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299396 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304270 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299740 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299804 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304398 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.299996 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300144 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300472 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300552 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300696 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300722 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300805 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300178 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.300973 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301002 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301169 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.280429 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301684 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.301812 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302017 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302392 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.302667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303642 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304718 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304711 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303760 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.303926 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304000 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304000 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304036 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.291839 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.285698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304232 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.304844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305127 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305231 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305601 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305649 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305688 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305764 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.305976 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306034 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306097 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306147 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306198 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: 
"bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306461 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.306834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307023 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307183 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307328 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307394 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307379 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307498 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307544 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307585 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:59 crc 
kubenswrapper[4830]: I0318 18:03:59.307511 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307625 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307657 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307690 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307728 4830 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307041 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307814 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307823 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307896 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-system-cni-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.308292 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.308094 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.307802 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.308156 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.308584 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309377 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309403 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-cnibin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-etc-kubernetes\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309463 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309492 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwr9q\" (UniqueName: \"kubernetes.io/projected/bacdd483-ef3d-43b9-92c1-67f1eac421ad-kube-api-access-xwr9q\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " pod="openshift-dns/node-resolver-fvhfm"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309523 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-cnibin\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309550 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309580 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-hostroot\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309607 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309633 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309713 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309703 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309966 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309283 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309638 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.310348 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.310429 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.810404651 +0000 UTC m=+74.378034983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310521 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.309661 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310718 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310803 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310843 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-k8s-cni-cncf-io\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-bin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310911 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310952 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg8nw\" (UniqueName: \"kubernetes.io/projected/f094c167-4135-4e16-97f3-2759780a857a-kube-api-access-jg8nw\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310997 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d206127d-732b-421d-85ad-22d8e21c2d45-host\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311031 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d206127d-732b-421d-85ad-22d8e21c2d45-serviceca\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311064 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311101 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbe02a32-24dc-4772-8a10-0128d3a304e4-proxy-tls\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbe02a32-24dc-4772-8a10-0128d3a304e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311376 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5n48\" (UniqueName: \"kubernetes.io/projected/fbe02a32-24dc-4772-8a10-0128d3a304e4-kube-api-access-l5n48\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311405 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-socket-dir-parent\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-netns\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311472 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkv8l\" (UniqueName: \"kubernetes.io/projected/0eb10a6f-af83-4366-9613-6350e3297007-kube-api-access-nkv8l\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311507 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311535 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311536 4830 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311565 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311629 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-daemon-config\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24d89\" (UniqueName: \"kubernetes.io/projected/437f27f7-4531-4e3e-b3a9-a471c7630012-kube-api-access-24d89\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310716 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311707 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8r9t\" (UniqueName: \"kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.310794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311729 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311750 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-multus-certs\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311832 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311858 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311921 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-system-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311955 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9kx\" (UniqueName: \"kubernetes.io/projected/55b8eced-700a-4b44-8315-c5afac8ca1bf-kube-api-access-6c9kx\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311995 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312027 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311089 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311247 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.311603 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312064 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312151 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-cni-binary-copy\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312172 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-multus\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0eb10a6f-af83-4366-9613-6350e3297007-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312253 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312381 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312480 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bacdd483-ef3d-43b9-92c1-67f1eac421ad-hosts-file\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " pod="openshift-dns/node-resolver-fvhfm"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-os-release\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312686 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbe02a32-24dc-4772-8a10-0128d3a304e4-rootfs\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7m7\" (UniqueName: \"kubernetes.io/projected/d206127d-732b-421d-85ad-22d8e21c2d45-kube-api-access-tq7m7\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-kubelet\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.312993 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-conf-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313025 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313198 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.313276 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313327 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313356 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.313478 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.813416547 +0000 UTC m=+74.381046939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313622 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.313720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.314129 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.316275 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.316366 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.316407 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.316620 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.317007 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.317058 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-os-release\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.317115 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.317145 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.318196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.319256 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.319982 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.325641 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.325991 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.326819 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329181 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329232 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329259 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329292 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329313 4830 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329351 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329355 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329373 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329400 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329420 4830 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329443 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath 
\"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329465 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329487 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329509 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329531 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329552 4830 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329378 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329574 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329595 4830 projected.go:194] Error preparing data for projected 
volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329446 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329597 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329661 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329689 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329566 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329667 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.829647298 +0000 UTC m=+74.397277690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329847 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.329893 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.829861423 +0000 UTC m=+74.397491795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329922 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329951 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329956 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.329975 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330005 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330023 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330052 4830 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330069 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330099 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330112 4830 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330127 4830 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330142 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330156 4830 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330171 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330184 4830 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330198 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330210 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330223 4830 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330238 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330252 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330268 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330281 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330294 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330307 4830 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330320 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330332 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330345 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330356 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330370 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330383 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330395 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330408 4830 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330422 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330434 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330447 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330459 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330470 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330481 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330494 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330505 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330518 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330529 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330541 4830 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330552 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330564 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330574 4830 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330584 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330595 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330605 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330616 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330629 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330640 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330650 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330662 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330675 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330687 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330699 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330711 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330722 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330737 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330749 4830 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330760 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330795 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330808 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330831 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330843 4830 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330855 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330866 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330878 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330890 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330902 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330914 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330928 4830 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330941 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330952 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330964 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330975 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.330987 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331001 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331013 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331027 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331040 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331052 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\"
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331065 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331080 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331093 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331108 4830 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331121 4830 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331134 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331146 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331160 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331175 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331188 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331201 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331212 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331225 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331246 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331259 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331270 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331285 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331298 4830 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331312 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331327 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331339 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331350 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331365 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331376 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331388 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331400 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331411 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331423 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331436 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331448 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331460 4830 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331472 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331484 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331497 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331510 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331523 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331535 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331547 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331560 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331573 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331585 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331598 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331610 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331621 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331633 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331645 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331658 4830 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331672 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331690 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331708 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\"
DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331722 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331732 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331743 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331756 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331784 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331798 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331810 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331823 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331834 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331846 4830 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331860 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331873 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331885 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331896 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331910 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331921 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331933 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331950 4830 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331961 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331971 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.331989 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332000 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332015 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332026 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332037 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332049 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332060 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332072 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332468 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.332851 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333196 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333217 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333308 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333409 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333407 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.333978 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.334178 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.334239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.341661 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.343640 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.348185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.348221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.348233 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.348253 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.348269 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.353916 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.364956 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.367199 4830 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.372135 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.374712 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.380374 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg8nw\" (UniqueName: \"kubernetes.io/projected/f094c167-4135-4e16-97f3-2759780a857a-kube-api-access-jg8nw\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-k8s-cni-cncf-io\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432895 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-bin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432923 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432968 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbe02a32-24dc-4772-8a10-0128d3a304e4-proxy-tls\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " 
pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.432994 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbe02a32-24dc-4772-8a10-0128d3a304e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-bin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-k8s-cni-cncf-io\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433612 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5n48\" (UniqueName: \"kubernetes.io/projected/fbe02a32-24dc-4772-8a10-0128d3a304e4-kube-api-access-l5n48\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433713 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d206127d-732b-421d-85ad-22d8e21c2d45-host\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " 
pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433755 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d206127d-732b-421d-85ad-22d8e21c2d45-serviceca\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433788 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433859 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc 
kubenswrapper[4830]: I0318 18:03:59.433900 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbe02a32-24dc-4772-8a10-0128d3a304e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433918 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-socket-dir-parent\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d206127d-732b-421d-85ad-22d8e21c2d45-host\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433968 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433979 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-netns\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433936 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-netns\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434018 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkv8l\" (UniqueName: \"kubernetes.io/projected/0eb10a6f-af83-4366-9613-6350e3297007-kube-api-access-nkv8l\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434042 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434166 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-daemon-config\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24d89\" (UniqueName: \"kubernetes.io/projected/437f27f7-4531-4e3e-b3a9-a471c7630012-kube-api-access-24d89\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8r9t\" 
(UniqueName: \"kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434256 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434276 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-multus-certs\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434638 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434653 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434697 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434707 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-system-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434812 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9kx\" (UniqueName: \"kubernetes.io/projected/55b8eced-700a-4b44-8315-c5afac8ca1bf-kube-api-access-6c9kx\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434831 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.433862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434986 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-socket-dir-parent\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " 
pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434885 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0eb10a6f-af83-4366-9613-6350e3297007-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435060 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435239 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-cni-binary-copy\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435285 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-multus\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " 
pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-daemon-config\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435524 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bacdd483-ef3d-43b9-92c1-67f1eac421ad-hosts-file\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435577 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bacdd483-ef3d-43b9-92c1-67f1eac421ad-hosts-file\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435600 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-os-release\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435613 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435706 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d206127d-732b-421d-85ad-22d8e21c2d45-serviceca\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbe02a32-24dc-4772-8a10-0128d3a304e4-rootfs\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-system-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-cni-multus\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436037 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-run-multus-certs\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436061 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.434372 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436190 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436269 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436302 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.436275 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.436402 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:59.936372048 +0000 UTC m=+74.504002420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbe02a32-24dc-4772-8a10-0128d3a304e4-rootfs\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.435666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-os-release\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.436883 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55b8eced-700a-4b44-8315-c5afac8ca1bf-cni-binary-copy\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " 
pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.437742 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.437829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.437977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.438004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbe02a32-24dc-4772-8a10-0128d3a304e4-proxy-tls\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.438105 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.439313 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440153 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7m7\" (UniqueName: \"kubernetes.io/projected/d206127d-732b-421d-85ad-22d8e21c2d45-kube-api-access-tq7m7\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-kubelet\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-conf-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440234 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-os-release\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-system-cni-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440378 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440406 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xwr9q\" (UniqueName: \"kubernetes.io/projected/bacdd483-ef3d-43b9-92c1-67f1eac421ad-kube-api-access-xwr9q\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440428 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-cnibin\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440452 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440475 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440497 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-cnibin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-etc-kubernetes\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-hostroot\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440676 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440729 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440744 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440757 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440785 4830 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440800 4830 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440815 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440830 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440846 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440858 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440870 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.440884 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441104 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441118 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441123 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0eb10a6f-af83-4366-9613-6350e3297007-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441131 4830 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-etc-kubernetes\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441623 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-cnibin\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441823 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-host-var-lib-kubelet\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " 
pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.441852 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-conf-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.442243 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f094c167-4135-4e16-97f3-2759780a857a-cni-binary-copy\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.442304 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-multus-cni-dir\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.442339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-cnibin\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.442698 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0eb10a6f-af83-4366-9613-6350e3297007-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.443037 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55b8eced-700a-4b44-8315-c5afac8ca1bf-hostroot\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.443070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.443102 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.443142 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-os-release\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.443316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-system-cni-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.444033 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f094c167-4135-4e16-97f3-2759780a857a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.444077 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.444096 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.445795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.450486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.450530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.450546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.450565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.450578 4830 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.451254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5n48\" (UniqueName: \"kubernetes.io/projected/fbe02a32-24dc-4772-8a10-0128d3a304e4-kube-api-access-l5n48\") pod \"machine-config-daemon-plzpb\" (UID: \"fbe02a32-24dc-4772-8a10-0128d3a304e4\") " pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.453985 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8r9t\" (UniqueName: \"kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t\") pod \"ovnkube-node-vjt8t\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.457546 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24d89\" (UniqueName: \"kubernetes.io/projected/437f27f7-4531-4e3e-b3a9-a471c7630012-kube-api-access-24d89\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.458086 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwr9q\" (UniqueName: \"kubernetes.io/projected/bacdd483-ef3d-43b9-92c1-67f1eac421ad-kube-api-access-xwr9q\") pod \"node-resolver-fvhfm\" (UID: \"bacdd483-ef3d-43b9-92c1-67f1eac421ad\") " 
pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.458859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7m7\" (UniqueName: \"kubernetes.io/projected/d206127d-732b-421d-85ad-22d8e21c2d45-kube-api-access-tq7m7\") pod \"node-ca-5tfzr\" (UID: \"d206127d-732b-421d-85ad-22d8e21c2d45\") " pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.459945 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9kx\" (UniqueName: \"kubernetes.io/projected/55b8eced-700a-4b44-8315-c5afac8ca1bf-kube-api-access-6c9kx\") pod \"multus-zpw8m\" (UID: \"55b8eced-700a-4b44-8315-c5afac8ca1bf\") " pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.465326 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg8nw\" (UniqueName: \"kubernetes.io/projected/f094c167-4135-4e16-97f3-2759780a857a-kube-api-access-jg8nw\") pod \"multus-additional-cni-plugins-c5rtg\" (UID: \"f094c167-4135-4e16-97f3-2759780a857a\") " pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.466749 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkv8l\" (UniqueName: \"kubernetes.io/projected/0eb10a6f-af83-4366-9613-6350e3297007-kube-api-access-nkv8l\") pod \"ovnkube-control-plane-749d76644c-nnmtt\" (UID: \"0eb10a6f-af83-4366-9613-6350e3297007\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.479627 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.487828 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.496005 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: source /etc/kubernetes/apiserver-url.env Mar 18 18:03:59 crc kubenswrapper[4830]: else Mar 18 18:03:59 crc kubenswrapper[4830]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 18:03:59 crc kubenswrapper[4830]: exit 1 Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 18:03:59 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.497208 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.497894 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fvhfm" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.498068 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c322fd63010b7968bb50213f66f107454c4e62f11e75c29fc4a50946b5f4a3d2 WatchSource:0}: Error finding container c322fd63010b7968bb50213f66f107454c4e62f11e75c29fc4a50946b5f4a3d2: Status 404 returned error can't find the container with id c322fd63010b7968bb50213f66f107454c4e62f11e75c29fc4a50946b5f4a3d2 Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.499977 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:03:59 crc kubenswrapper[4830]: set +o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 18:03:59 crc kubenswrapper[4830]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 18:03:59 crc kubenswrapper[4830]: ho_enable="--enable-hybrid-overlay" Mar 18 18:03:59 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 18:03:59 crc kubenswrapper[4830]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 18:03:59 crc kubenswrapper[4830]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-host=127.0.0.1 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-port=9743 \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ho_enable} \ Mar 18 18:03:59 crc kubenswrapper[4830]: --enable-interconnect \ Mar 18 18:03:59 crc kubenswrapper[4830]: --disable-approver \ Mar 18 18:03:59 crc kubenswrapper[4830]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --wait-for-kubernetes-api=200s \ Mar 18 18:03:59 crc kubenswrapper[4830]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.504699 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:59 crc 
kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:03:59 crc kubenswrapper[4830]: set +o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --disable-webhook \ Mar 18 18:03:59 crc kubenswrapper[4830]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.506061 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.508419 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.509479 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbacdd483_ef3d_43b9_92c1_67f1eac421ad.slice/crio-34536aa6f6a23c024d5bf4a10f19706f82a3bb0f2c4c842d1a60d63bf07319e5 WatchSource:0}: Error finding container 34536aa6f6a23c024d5bf4a10f19706f82a3bb0f2c4c842d1a60d63bf07319e5: Status 404 returned error can't find the container with id 34536aa6f6a23c024d5bf4a10f19706f82a3bb0f2c4c842d1a60d63bf07319e5 Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.514845 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 18:03:59 crc kubenswrapper[4830]: set -uo pipefail Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 18:03:59 crc kubenswrapper[4830]: HOSTS_FILE="/etc/hosts" Mar 18 18:03:59 crc kubenswrapper[4830]: TEMP_FILE="/etc/hosts.tmp" Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Make a temporary file with the old hosts file's attributes. Mar 18 18:03:59 crc kubenswrapper[4830]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 18:03:59 crc kubenswrapper[4830]: echo "Failed to preserve hosts file. Exiting." Mar 18 18:03:59 crc kubenswrapper[4830]: exit 1 Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: while true; do Mar 18 18:03:59 crc kubenswrapper[4830]: declare -A svc_ips Mar 18 18:03:59 crc kubenswrapper[4830]: for svc in "${services[@]}"; do Mar 18 18:03:59 crc kubenswrapper[4830]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 18:03:59 crc kubenswrapper[4830]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 18:03:59 crc kubenswrapper[4830]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 18:03:59 crc kubenswrapper[4830]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 18:03:59 crc kubenswrapper[4830]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 18:03:59 crc kubenswrapper[4830]: for i in ${!cmds[*]} Mar 18 18:03:59 crc kubenswrapper[4830]: do Mar 18 18:03:59 crc kubenswrapper[4830]: ips=($(eval "${cmds[i]}")) Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: svc_ips["${svc}"]="${ips[@]}" Mar 18 18:03:59 crc kubenswrapper[4830]: break Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Update /etc/hosts only if we get valid service IPs Mar 18 18:03:59 crc kubenswrapper[4830]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 18:03:59 crc kubenswrapper[4830]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 18:03:59 crc kubenswrapper[4830]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 18:03:59 crc kubenswrapper[4830]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: continue Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Append resolver entries for services Mar 18 18:03:59 crc kubenswrapper[4830]: rc=0 Mar 18 18:03:59 crc kubenswrapper[4830]: for svc in "${!svc_ips[@]}"; do Mar 18 18:03:59 crc kubenswrapper[4830]: for ip in ${svc_ips[${svc}]}; do Mar 18 18:03:59 crc kubenswrapper[4830]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ $rc -ne 0 ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: continue Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 18:03:59 crc kubenswrapper[4830]: # Replace /etc/hosts with our modified version if needed Mar 18 18:03:59 crc kubenswrapper[4830]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 18:03:59 crc kubenswrapper[4830]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: unset svc_ips Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwr9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-fvhfm_openshift-dns(bacdd483-ef3d-43b9-92c1-67f1eac421ad): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.516785 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-fvhfm" podUID="bacdd483-ef3d-43b9-92c1-67f1eac421ad" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.517926 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5tfzr" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.523234 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fvhfm" event={"ID":"bacdd483-ef3d-43b9-92c1-67f1eac421ad","Type":"ContainerStarted","Data":"34536aa6f6a23c024d5bf4a10f19706f82a3bb0f2c4c842d1a60d63bf07319e5"} Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.523320 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe02a32_24dc_4772_8a10_0128d3a304e4.slice/crio-20e4b33d13e309b589f8b809d99a3fb8ec60a38bcecf5d5b72bed33373ffceda WatchSource:0}: Error finding container 20e4b33d13e309b589f8b809d99a3fb8ec60a38bcecf5d5b72bed33373ffceda: Status 404 returned error can't find the container with id 20e4b33d13e309b589f8b809d99a3fb8ec60a38bcecf5d5b72bed33373ffceda Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.524477 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c322fd63010b7968bb50213f66f107454c4e62f11e75c29fc4a50946b5f4a3d2"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.524893 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zpw8m" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.527047 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 18:03:59 crc kubenswrapper[4830]: set -uo pipefail Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 18:03:59 crc kubenswrapper[4830]: HOSTS_FILE="/etc/hosts" Mar 18 18:03:59 crc kubenswrapper[4830]: TEMP_FILE="/etc/hosts.tmp" Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Make a temporary file with the old hosts file's attributes. Mar 18 18:03:59 crc kubenswrapper[4830]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 18:03:59 crc kubenswrapper[4830]: echo "Failed to preserve hosts file. Exiting." Mar 18 18:03:59 crc kubenswrapper[4830]: exit 1 Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: while true; do Mar 18 18:03:59 crc kubenswrapper[4830]: declare -A svc_ips Mar 18 18:03:59 crc kubenswrapper[4830]: for svc in "${services[@]}"; do Mar 18 18:03:59 crc kubenswrapper[4830]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 18:03:59 crc kubenswrapper[4830]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 18 18:03:59 crc kubenswrapper[4830]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 18:03:59 crc kubenswrapper[4830]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 18:03:59 crc kubenswrapper[4830]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 18:03:59 crc kubenswrapper[4830]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 18:03:59 crc kubenswrapper[4830]: for i in ${!cmds[*]} Mar 18 18:03:59 crc kubenswrapper[4830]: do Mar 18 18:03:59 crc kubenswrapper[4830]: ips=($(eval "${cmds[i]}")) Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: svc_ips["${svc}"]="${ips[@]}" Mar 18 18:03:59 crc kubenswrapper[4830]: break Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Update /etc/hosts only if we get valid service IPs Mar 18 18:03:59 crc kubenswrapper[4830]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 18:03:59 crc kubenswrapper[4830]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 18:03:59 crc kubenswrapper[4830]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 18:03:59 crc kubenswrapper[4830]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: continue Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # Append resolver entries for services Mar 18 18:03:59 crc kubenswrapper[4830]: rc=0 Mar 18 18:03:59 crc kubenswrapper[4830]: for svc in "${!svc_ips[@]}"; do Mar 18 18:03:59 crc kubenswrapper[4830]: for ip in ${svc_ips[${svc}]}; do Mar 18 18:03:59 crc kubenswrapper[4830]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ $rc -ne 0 ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: continue Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 18:03:59 crc kubenswrapper[4830]: # Replace /etc/hosts with our modified version if needed Mar 18 18:03:59 crc kubenswrapper[4830]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 18:03:59 crc kubenswrapper[4830]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait Mar 18 18:03:59 crc kubenswrapper[4830]: unset svc_ips Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwr9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-fvhfm_openshift-dns(bacdd483-ef3d-43b9-92c1-67f1eac421ad): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.527731 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4fc031b4cd68401f309ba3412704baff8631ffe5936d4ec99f78075c86fdea25"} Mar 18 18:03:59 crc 
kubenswrapper[4830]: E0318 18:03:59.527981 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:03:59 crc kubenswrapper[4830]: set +o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 18:03:59 crc kubenswrapper[4830]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 18:03:59 crc kubenswrapper[4830]: ho_enable="--enable-hybrid-overlay" Mar 18 18:03:59 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 18:03:59 crc kubenswrapper[4830]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 18:03:59 crc kubenswrapper[4830]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-host=127.0.0.1 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --webhook-port=9743 \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ho_enable} \ Mar 18 18:03:59 crc kubenswrapper[4830]: --enable-interconnect \ Mar 18 18:03:59 crc kubenswrapper[4830]: --disable-approver \ Mar 18 18:03:59 crc kubenswrapper[4830]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --wait-for-kubernetes-api=200s \ Mar 18 18:03:59 crc kubenswrapper[4830]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.527997 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5n48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.528179 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-fvhfm" podUID="bacdd483-ef3d-43b9-92c1-67f1eac421ad" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.530422 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5n48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.530666 4830 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:03:59 crc kubenswrapper[4830]: set +o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --disable-webhook \ Mar 18 18:03:59 crc kubenswrapper[4830]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.530693 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 18:03:59 crc 
kubenswrapper[4830]: source /etc/kubernetes/apiserver-url.env Mar 18 18:03:59 crc kubenswrapper[4830]: else Mar 18 18:03:59 crc kubenswrapper[4830]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 18:03:59 crc kubenswrapper[4830]: exit 1 Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c
69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.531968 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" 
with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.532031 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.532070 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.533275 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.533933 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.535430 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd206127d_732b_421d_85ad_22d8e21c2d45.slice/crio-a75414e698958a9164bb02ebdf49c1d934388324dcfeb5df60a501f8cf20c790 WatchSource:0}: Error finding container a75414e698958a9164bb02ebdf49c1d934388324dcfeb5df60a501f8cf20c790: Status 404 returned error can't find the container with id a75414e698958a9164bb02ebdf49c1d934388324dcfeb5df60a501f8cf20c790 Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.539083 4830 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.545358 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.545575 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b8eced_700a_4b44_8315_c5afac8ca1bf.slice/crio-8183879e5928f5a3917da2d450ebba9d1c89c6b8a76144a92e4e394f2f9da930 WatchSource:0}: Error finding container 8183879e5928f5a3917da2d450ebba9d1c89c6b8a76144a92e4e394f2f9da930: Status 404 returned error can't find the container with id 8183879e5928f5a3917da2d450ebba9d1c89c6b8a76144a92e4e394f2f9da930 Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.546636 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 18:03:59 crc kubenswrapper[4830]: while [ true ]; Mar 18 18:03:59 crc kubenswrapper[4830]: do Mar 18 18:03:59 crc kubenswrapper[4830]: for f in $(ls /tmp/serviceca); do Mar 18 18:03:59 crc kubenswrapper[4830]: echo $f Mar 18 18:03:59 crc kubenswrapper[4830]: ca_file_path="/tmp/serviceca/${f}" Mar 18 18:03:59 crc kubenswrapper[4830]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 18:03:59 crc kubenswrapper[4830]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 18:03:59 crc kubenswrapper[4830]: if [ -e "${reg_dir_path}" ]; then Mar 18 18:03:59 crc kubenswrapper[4830]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 18:03:59 crc kubenswrapper[4830]: else Mar 18 18:03:59 crc kubenswrapper[4830]: mkdir $reg_dir_path Mar 18 
18:03:59 crc kubenswrapper[4830]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: for d in $(ls /etc/docker/certs.d); do Mar 18 18:03:59 crc kubenswrapper[4830]: echo $d Mar 18 18:03:59 crc kubenswrapper[4830]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 18:03:59 crc kubenswrapper[4830]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 18:03:59 crc kubenswrapper[4830]: if [ ! -e "${reg_conf_path}" ]; then Mar 18 18:03:59 crc kubenswrapper[4830]: rm -rf /etc/docker/certs.d/$d Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 60 & wait ${!} Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tq7m7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-5tfzr_openshift-image-registry(d206127d-732b-421d-85ad-22d8e21c2d45): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.547682 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-5tfzr" podUID="d206127d-732b-421d-85ad-22d8e21c2d45" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.551947 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.554745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.554849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.554879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.554913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.554938 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.556866 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 18:03:59 crc kubenswrapper[4830]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 18:03:59 crc kubenswrapper[4830]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6c9kx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zpw8m_openshift-multus(55b8eced-700a-4b44-8315-c5afac8ca1bf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.557963 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zpw8m" podUID="55b8eced-700a-4b44-8315-c5afac8ca1bf" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.560456 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: 
init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 18:03:59 crc kubenswrapper[4830]: apiVersion: v1 Mar 18 18:03:59 crc kubenswrapper[4830]: clusters: Mar 18 18:03:59 crc kubenswrapper[4830]: - cluster: Mar 18 18:03:59 crc kubenswrapper[4830]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 18:03:59 crc kubenswrapper[4830]: server: https://api-int.crc.testing:6443 Mar 18 18:03:59 crc kubenswrapper[4830]: name: default-cluster Mar 18 18:03:59 crc kubenswrapper[4830]: contexts: Mar 18 18:03:59 crc kubenswrapper[4830]: - context: Mar 18 18:03:59 crc kubenswrapper[4830]: cluster: default-cluster Mar 18 18:03:59 crc kubenswrapper[4830]: namespace: default Mar 18 18:03:59 crc kubenswrapper[4830]: user: default-auth Mar 18 18:03:59 crc kubenswrapper[4830]: name: default-context Mar 18 18:03:59 crc kubenswrapper[4830]: current-context: default-context Mar 18 18:03:59 crc kubenswrapper[4830]: kind: Config Mar 18 18:03:59 crc kubenswrapper[4830]: preferences: {} Mar 18 18:03:59 crc kubenswrapper[4830]: users: Mar 18 18:03:59 crc kubenswrapper[4830]: - name: default-auth Mar 18 18:03:59 crc kubenswrapper[4830]: user: Mar 18 18:03:59 crc kubenswrapper[4830]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 18:03:59 crc kubenswrapper[4830]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 18:03:59 crc kubenswrapper[4830]: EOF Mar 18 18:03:59 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8r9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-vjt8t_openshift-ovn-kubernetes(af6abd23-401c-4f5a-a63a-19d7eed4f9ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.561741 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.566326 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.576020 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c7619e54ecfd993cdeb4554b7860e957d74de9fc47d49d76abb635f43be4358f WatchSource:0}: Error finding container c7619e54ecfd993cdeb4554b7860e957d74de9fc47d49d76abb635f43be4358f: Status 404 returned error can't find the container with id c7619e54ecfd993cdeb4554b7860e957d74de9fc47d49d76abb635f43be4358f Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.582526 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.583876 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.585654 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.586013 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb10a6f_af83_4366_9613_6350e3297007.slice/crio-29771612f9b31bbcfb92ad83542ee2b6dd006a9c0f08e02cf979faf13fce8845 WatchSource:0}: Error finding container 29771612f9b31bbcfb92ad83542ee2b6dd006a9c0f08e02cf979faf13fce8845: Status 404 returned error can't find the container with id 29771612f9b31bbcfb92ad83542ee2b6dd006a9c0f08e02cf979faf13fce8845 Mar 18 18:03:59 crc kubenswrapper[4830]: W0318 18:03:59.587146 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf094c167_4135_4e16_97f3_2759780a857a.slice/crio-3ccbef7212c43b83b4c65d8d45ebf5864bcdf31a336b5fdff8f91636eb81c238 WatchSource:0}: Error finding container 3ccbef7212c43b83b4c65d8d45ebf5864bcdf31a336b5fdff8f91636eb81c238: Status 404 returned error can't find the container with id 3ccbef7212c43b83b4c65d8d45ebf5864bcdf31a336b5fdff8f91636eb81c238 Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.589115 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c 
#!/bin/bash Mar 18 18:03:59 crc kubenswrapper[4830]: set -euo pipefail Mar 18 18:03:59 crc kubenswrapper[4830]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 18:03:59 crc kubenswrapper[4830]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 18:03:59 crc kubenswrapper[4830]: # As the secret mount is optional we must wait for the files to be present. Mar 18 18:03:59 crc kubenswrapper[4830]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 18:03:59 crc kubenswrapper[4830]: TS=$(date +%s) Mar 18 18:03:59 crc kubenswrapper[4830]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 18:03:59 crc kubenswrapper[4830]: HAS_LOGGED_INFO=0 Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: log_missing_certs(){ Mar 18 18:03:59 crc kubenswrapper[4830]: CUR_TS=$(date +%s) Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 18:03:59 crc kubenswrapper[4830]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 18:03:59 crc kubenswrapper[4830]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 18:03:59 crc kubenswrapper[4830]: HAS_LOGGED_INFO=1 Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: } Mar 18 18:03:59 crc kubenswrapper[4830]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 18:03:59 crc kubenswrapper[4830]: log_missing_certs Mar 18 18:03:59 crc kubenswrapper[4830]: sleep 5 Mar 18 18:03:59 crc kubenswrapper[4830]: done Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/kube-rbac-proxy \ Mar 18 18:03:59 crc kubenswrapper[4830]: --logtostderr \ Mar 18 18:03:59 crc kubenswrapper[4830]: --secure-listen-address=:9108 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 18:03:59 crc kubenswrapper[4830]: --upstream=http://127.0.0.1:29108/ \ Mar 18 18:03:59 crc kubenswrapper[4830]: --tls-private-key-file=${TLS_PK} \ Mar 18 18:03:59 crc kubenswrapper[4830]: --tls-cert-file=${TLS_CERT} Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkv8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-nnmtt_openshift-ovn-kubernetes(0eb10a6f-af83-4366-9613-6350e3297007): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.589322 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jg8nw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-c5rtg_openshift-multus(f094c167-4135-4e16-97f3-2759780a857a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.591072 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" podUID="f094c167-4135-4e16-97f3-2759780a857a" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.592005 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:59 crc kubenswrapper[4830]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: set -o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:03:59 crc kubenswrapper[4830]: set +o allexport Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v4_join_subnet_opt= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v6_join_subnet_opt= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v4_transit_switch_subnet_opt= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: ovn_v6_transit_switch_subnet_opt= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 
18:03:59 crc kubenswrapper[4830]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: dns_name_resolver_enabled_flag= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "false" == "true" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: persistent_ips_enabled_flag= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "true" == "true" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: # This is needed so that converting clusters from GA to TP Mar 18 18:03:59 crc kubenswrapper[4830]: # will rollout control plane pods as well Mar 18 18:03:59 crc kubenswrapper[4830]: network_segmentation_enabled_flag= Mar 18 18:03:59 crc kubenswrapper[4830]: multi_network_enabled_flag= Mar 18 18:03:59 crc kubenswrapper[4830]: if [[ "true" == "true" ]]; then Mar 18 18:03:59 crc kubenswrapper[4830]: multi_network_enabled_flag="--enable-multi-network" Mar 18 18:03:59 crc kubenswrapper[4830]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 18:03:59 crc kubenswrapper[4830]: fi Mar 18 18:03:59 crc kubenswrapper[4830]: Mar 18 18:03:59 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 18:03:59 crc kubenswrapper[4830]: exec /usr/bin/ovnkube \ Mar 18 18:03:59 crc kubenswrapper[4830]: --enable-interconnect \ Mar 18 18:03:59 crc kubenswrapper[4830]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 
18:03:59 crc kubenswrapper[4830]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 18:03:59 crc kubenswrapper[4830]: --metrics-enable-pprof \ Mar 18 18:03:59 crc kubenswrapper[4830]: --metrics-enable-config-duration \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ovn_v4_join_subnet_opt} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ovn_v6_join_subnet_opt} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${dns_name_resolver_enabled_flag} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${persistent_ips_enabled_flag} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${multi_network_enabled_flag} \ Mar 18 18:03:59 crc kubenswrapper[4830]: ${network_segmentation_enabled_flag} Mar 18 18:03:59 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkv8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-nnmtt_openshift-ovn-kubernetes(0eb10a6f-af83-4366-9613-6350e3297007): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:59 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.593206 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" podUID="0eb10a6f-af83-4366-9613-6350e3297007" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.600070 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.608751 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.616308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.625826 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.634898 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.650103 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.657980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.658049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.658067 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.658093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.658114 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.662864 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.678897 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.693948 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.709400 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.723608 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.736305 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.751275 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.761093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.761161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.761176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.761193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.761246 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.772697 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.786990 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.803838 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.817787 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.828641 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.843068 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.848333 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.848538 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.848650 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:00.848550956 +0000 UTC m=+75.416181348 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.848785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.848914 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.848704 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849079 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849115 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849138 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849152 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:00.849129021 +0000 UTC m=+75.416759353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849209 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:00.849184912 +0000 UTC m=+75.416815284 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.848842 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849267 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:00.849253544 +0000 UTC m=+75.416883916 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.849060 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849449 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849627 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849702 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.849842 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:04:00.849831198 +0000 UTC m=+75.417461610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.853869 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.864362 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.864413 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.864434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.864497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.864516 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.866251 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.878182 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.889607 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.900387 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.911005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.950379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:03:59 crc kubenswrapper[4830]: E0318 18:03:59.950599 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:03:59 crc 
kubenswrapper[4830]: E0318 18:03:59.950717 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:00.95069214 +0000 UTC m=+75.518322552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.967036 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.967083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.967104 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.967129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[4830]: I0318 18:03:59.967146 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.069828 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.069872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.069883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.069901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.069914 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.172950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.173006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.173023 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.173047 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.173062 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.234049 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.234223 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.238736 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.239695 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.241311 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.242212 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.243645 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.244378 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.245349 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.247394 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.248915 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.251003 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.251997 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.253240 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.253832 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.254711 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.255730 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.256604 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.257546 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.257986 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.258530 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.259554 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.260045 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.261066 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.261489 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.262555 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.263047 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.263717 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.264853 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.265320 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.266751 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.267544 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.269220 4830 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.269432 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.272659 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.273928 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.275819 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.276798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.276905 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.276981 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.277069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.277155 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.279485 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.280977 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.283273 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.284697 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.287086 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.288056 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.289976 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.290634 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.291307 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.291825 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.292429 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.293122 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.294144 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.294816 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.295401 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.295922 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.296555 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.297172 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.297687 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.380723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.380801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.380821 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.380844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.380860 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.482930 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.482983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.482992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.483008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.483018 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.531765 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"a7425dfb8b27990e707f48978b8a44c389acf9d1920a77ca6381f874ef3bdd3f"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.532750 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" event={"ID":"0eb10a6f-af83-4366-9613-6350e3297007","Type":"ContainerStarted","Data":"29771612f9b31bbcfb92ad83542ee2b6dd006a9c0f08e02cf979faf13fce8845"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.533823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c7619e54ecfd993cdeb4554b7860e957d74de9fc47d49d76abb635f43be4358f"} Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.534114 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:04:00 crc kubenswrapper[4830]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 18:04:00 crc kubenswrapper[4830]: apiVersion: v1 Mar 18 18:04:00 crc kubenswrapper[4830]: clusters: Mar 18 18:04:00 crc kubenswrapper[4830]: - cluster: Mar 18 18:04:00 crc kubenswrapper[4830]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 18:04:00 crc kubenswrapper[4830]: server: https://api-int.crc.testing:6443 Mar 18 18:04:00 crc kubenswrapper[4830]: name: default-cluster Mar 18 18:04:00 crc kubenswrapper[4830]: contexts: Mar 18 18:04:00 crc kubenswrapper[4830]: - context: Mar 18 18:04:00 crc kubenswrapper[4830]: cluster: default-cluster 
Mar 18 18:04:00 crc kubenswrapper[4830]: namespace: default Mar 18 18:04:00 crc kubenswrapper[4830]: user: default-auth Mar 18 18:04:00 crc kubenswrapper[4830]: name: default-context Mar 18 18:04:00 crc kubenswrapper[4830]: current-context: default-context Mar 18 18:04:00 crc kubenswrapper[4830]: kind: Config Mar 18 18:04:00 crc kubenswrapper[4830]: preferences: {} Mar 18 18:04:00 crc kubenswrapper[4830]: users: Mar 18 18:04:00 crc kubenswrapper[4830]: - name: default-auth Mar 18 18:04:00 crc kubenswrapper[4830]: user: Mar 18 18:04:00 crc kubenswrapper[4830]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 18:04:00 crc kubenswrapper[4830]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 18:04:00 crc kubenswrapper[4830]: EOF Mar 18 18:04:00 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8r9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-vjt8t_openshift-ovn-kubernetes(af6abd23-401c-4f5a-a63a-19d7eed4f9ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:04:00 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:04:00 crc 
kubenswrapper[4830]: E0318 18:04:00.535004 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.535250 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.535417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpw8m" event={"ID":"55b8eced-700a-4b44-8315-c5afac8ca1bf","Type":"ContainerStarted","Data":"8183879e5928f5a3917da2d450ebba9d1c89c6b8a76144a92e4e394f2f9da930"} Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.535830 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:04:00 crc kubenswrapper[4830]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 18:04:00 crc kubenswrapper[4830]: set -euo pipefail Mar 18 18:04:00 crc kubenswrapper[4830]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 18:04:00 crc kubenswrapper[4830]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 18:04:00 crc kubenswrapper[4830]: # As the secret mount is optional we must wait for the files to be present. Mar 18 18:04:00 crc kubenswrapper[4830]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 18 18:04:00 crc kubenswrapper[4830]: TS=$(date +%s) Mar 18 18:04:00 crc kubenswrapper[4830]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 18:04:00 crc kubenswrapper[4830]: HAS_LOGGED_INFO=0 Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: log_missing_certs(){ Mar 18 18:04:00 crc kubenswrapper[4830]: CUR_TS=$(date +%s) Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 18:04:00 crc kubenswrapper[4830]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 18:04:00 crc kubenswrapper[4830]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 18:04:00 crc kubenswrapper[4830]: HAS_LOGGED_INFO=1 Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: } Mar 18 18:04:00 crc kubenswrapper[4830]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 18:04:00 crc kubenswrapper[4830]: log_missing_certs Mar 18 18:04:00 crc kubenswrapper[4830]: sleep 5 Mar 18 18:04:00 crc kubenswrapper[4830]: done Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 18:04:00 crc kubenswrapper[4830]: exec /usr/bin/kube-rbac-proxy \ Mar 18 18:04:00 crc kubenswrapper[4830]: --logtostderr \ Mar 18 18:04:00 crc kubenswrapper[4830]: --secure-listen-address=:9108 \ Mar 18 18:04:00 crc kubenswrapper[4830]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 18:04:00 crc kubenswrapper[4830]: --upstream=http://127.0.0.1:29108/ \ Mar 18 18:04:00 crc kubenswrapper[4830]: --tls-private-key-file=${TLS_PK} \ Mar 18 18:04:00 crc kubenswrapper[4830]: --tls-cert-file=${TLS_CERT} Mar 18 18:04:00 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkv8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-nnmtt_openshift-ovn-kubernetes(0eb10a6f-af83-4366-9613-6350e3297007): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:04:00 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.536172 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.536403 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5tfzr" event={"ID":"d206127d-732b-421d-85ad-22d8e21c2d45","Type":"ContainerStarted","Data":"a75414e698958a9164bb02ebdf49c1d934388324dcfeb5df60a501f8cf20c790"} Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.537032 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:04:00 crc kubenswrapper[4830]: container 
&Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 18:04:00 crc kubenswrapper[4830]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 18:04:00 crc kubenswrapper[4830]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6c9kx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zpw8m_openshift-multus(55b8eced-700a-4b44-8315-c5afac8ca1bf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:04:00 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.537273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"20e4b33d13e309b589f8b809d99a3fb8ec60a38bcecf5d5b72bed33373ffceda"} Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.537924 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:04:00 crc kubenswrapper[4830]: container 
&Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: set -o allexport Mar 18 18:04:00 crc kubenswrapper[4830]: source "/env/_master" Mar 18 18:04:00 crc kubenswrapper[4830]: set +o allexport Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v4_join_subnet_opt= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v6_join_subnet_opt= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v4_transit_switch_subnet_opt= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v6_transit_switch_subnet_opt= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "" != "" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: dns_name_resolver_enabled_flag= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "false" == "true" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: 
dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: persistent_ips_enabled_flag= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "true" == "true" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: # This is needed so that converting clusters from GA to TP Mar 18 18:04:00 crc kubenswrapper[4830]: # will rollout control plane pods as well Mar 18 18:04:00 crc kubenswrapper[4830]: network_segmentation_enabled_flag= Mar 18 18:04:00 crc kubenswrapper[4830]: multi_network_enabled_flag= Mar 18 18:04:00 crc kubenswrapper[4830]: if [[ "true" == "true" ]]; then Mar 18 18:04:00 crc kubenswrapper[4830]: multi_network_enabled_flag="--enable-multi-network" Mar 18 18:04:00 crc kubenswrapper[4830]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: Mar 18 18:04:00 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 18:04:00 crc kubenswrapper[4830]: exec /usr/bin/ovnkube \ Mar 18 18:04:00 crc kubenswrapper[4830]: --enable-interconnect \ Mar 18 18:04:00 crc kubenswrapper[4830]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 18:04:00 crc kubenswrapper[4830]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 18:04:00 crc kubenswrapper[4830]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 18:04:00 crc kubenswrapper[4830]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 18:04:00 crc kubenswrapper[4830]: --metrics-enable-pprof \ Mar 18 18:04:00 crc kubenswrapper[4830]: --metrics-enable-config-duration \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${ovn_v4_join_subnet_opt} \ Mar 18 18:04:00 crc 
kubenswrapper[4830]: ${ovn_v6_join_subnet_opt} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${dns_name_resolver_enabled_flag} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${persistent_ips_enabled_flag} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${multi_network_enabled_flag} \ Mar 18 18:04:00 crc kubenswrapper[4830]: ${network_segmentation_enabled_flag} Mar 18 18:04:00 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkv8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-nnmtt_openshift-ovn-kubernetes(0eb10a6f-af83-4366-9613-6350e3297007): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:04:00 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.538018 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:04:00 crc kubenswrapper[4830]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 18:04:00 crc kubenswrapper[4830]: while [ true ]; Mar 18 18:04:00 crc kubenswrapper[4830]: do Mar 18 18:04:00 crc kubenswrapper[4830]: for f in $(ls /tmp/serviceca); do Mar 18 18:04:00 crc kubenswrapper[4830]: echo $f Mar 18 18:04:00 crc kubenswrapper[4830]: ca_file_path="/tmp/serviceca/${f}" Mar 18 18:04:00 crc kubenswrapper[4830]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 18:04:00 crc 
kubenswrapper[4830]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 18:04:00 crc kubenswrapper[4830]: if [ -e "${reg_dir_path}" ]; then Mar 18 18:04:00 crc kubenswrapper[4830]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 18:04:00 crc kubenswrapper[4830]: else Mar 18 18:04:00 crc kubenswrapper[4830]: mkdir $reg_dir_path Mar 18 18:04:00 crc kubenswrapper[4830]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: done Mar 18 18:04:00 crc kubenswrapper[4830]: for d in $(ls /etc/docker/certs.d); do Mar 18 18:04:00 crc kubenswrapper[4830]: echo $d Mar 18 18:04:00 crc kubenswrapper[4830]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 18:04:00 crc kubenswrapper[4830]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 18:04:00 crc kubenswrapper[4830]: if [ ! -e "${reg_conf_path}" ]; then Mar 18 18:04:00 crc kubenswrapper[4830]: rm -rf /etc/docker/certs.d/$d Mar 18 18:04:00 crc kubenswrapper[4830]: fi Mar 18 18:04:00 crc kubenswrapper[4830]: done Mar 18 18:04:00 crc kubenswrapper[4830]: sleep 60 & wait ${!} Mar 18 18:04:00 crc kubenswrapper[4830]: done Mar 18 18:04:00 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tq7m7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-5tfzr_openshift-image-registry(d206127d-732b-421d-85ad-22d8e21c2d45): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:04:00 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.538095 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zpw8m" podUID="55b8eced-700a-4b44-8315-c5afac8ca1bf" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.538301 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5n48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.538709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerStarted","Data":"3ccbef7212c43b83b4c65d8d45ebf5864bcdf31a336b5fdff8f91636eb81c238"} Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.539476 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" podUID="0eb10a6f-af83-4366-9613-6350e3297007" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.539485 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-5tfzr" podUID="d206127d-732b-421d-85ad-22d8e21c2d45" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.540253 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jg8nw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-c5rtg_openshift-multus(f094c167-4135-4e16-97f3-2759780a857a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.540277 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5n48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.541864 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" podUID="f094c167-4135-4e16-97f3-2759780a857a" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.541889 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.548655 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.559968 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.569355 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.578983 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.585713 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.585757 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.585787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.585802 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.585813 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.589405 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.601638 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.609471 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.618578 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.625254 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.633233 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.648404 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.659633 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.668250 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.675725 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.688988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.689029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.689041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.689064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.689077 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.692488 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.703308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.715012 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.726294 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.740477 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.757022 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.770094 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.787562 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.792602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.792637 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.792648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.792663 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.792673 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.822064 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.857688 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.860092 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860263 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.860231751 +0000 UTC m=+77.427862103 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.860306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.860350 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.860393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860520 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860566 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.860556609 +0000 UTC m=+77.428186941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860584 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860623 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860627 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860645 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.860848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860887 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860914 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.860765145 +0000 UTC m=+77.428395507 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860931 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860959 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.860965 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.860953619 +0000 UTC m=+77.428584021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.861035 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.861013331 +0000 UTC m=+77.428643693 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.895227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.895279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.895292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.895310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.895322 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.899076 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.947223 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:00 crc kubenswrapper[4830]: I0318 18:04:00.961462 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.961627 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 18:04:00 crc kubenswrapper[4830]: E0318 18:04:00.961741 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:02.961718399 +0000 UTC m=+77.529348821 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.002235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.002288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.002301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.002320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.002336 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.016880 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.032825 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.104589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.104896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.104978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.105058 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.105159 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.208057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.208123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.208133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.208148 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.208160 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.235040 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.235064 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.235064 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:01 crc kubenswrapper[4830]: E0318 18:04:01.235864 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:01 crc kubenswrapper[4830]: E0318 18:04:01.235957 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:01 crc kubenswrapper[4830]: E0318 18:04:01.236142 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.251507 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.251758 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.309973 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.310369 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.310380 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.310397 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.310409 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.412517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.412545 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.412553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.412566 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.412575 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.515197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.515232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.515242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.515255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.515265 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.545332 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.547269 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.547579 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.565117 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.575700 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.585004 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.595649 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.604051 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.616360 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.617842 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.617874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.617885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.617901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.617911 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.628690 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.638356 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.651619 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.660747 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.669723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.685634 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.698081 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.709381 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.720719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.720750 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.720762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.720792 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.720806 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.725241 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.824090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.824130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.824139 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.824155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.824166 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.926861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.926901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.926911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.926928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[4830]: I0318 18:04:01.926939 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.029858 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.029891 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.029899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.029914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.029925 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.131804 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.131839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.131848 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.131861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.131869 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.233711 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.233909 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.241349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.241409 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.241428 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.241453 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.241470 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.344544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.344629 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.344643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.344656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.344665 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.447425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.447484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.447502 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.447528 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.447549 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.550087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.550140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.550157 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.550180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.550197 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.652271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.652314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.652326 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.652342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.652353 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.755409 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.755443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.755453 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.755470 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.755481 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.858059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.858105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.858113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.858128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.858137 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.880501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.880671 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.880710 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.880895 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.880970 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:06.880949224 +0000 UTC m=+81.448579566 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881012 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:06.880986805 +0000 UTC m=+81.448617177 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881197 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881233 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881251 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881323 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:06.881299653 +0000 UTC m=+81.448930055 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881388 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881404 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881417 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.881419 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881459 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:06.881448017 +0000 UTC m=+81.449078449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.881487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881610 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.881656 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:04:06.881641241 +0000 UTC m=+81.449271664 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.961244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.961278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.961288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.961305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.961317 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[4830]: I0318 18:04:02.982179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.982404 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:02 crc kubenswrapper[4830]: E0318 18:04:02.982464 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:06.982445012 +0000 UTC m=+81.550075354 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.064286 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.064347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.064359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.064386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.064399 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.166963 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.167013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.167027 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.167048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.167063 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.233713 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.233762 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:03 crc kubenswrapper[4830]: E0318 18:04:03.233968 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.233990 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:03 crc kubenswrapper[4830]: E0318 18:04:03.234124 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:03 crc kubenswrapper[4830]: E0318 18:04:03.234230 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.269092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.269166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.269196 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.269226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.269246 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.372603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.372683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.372701 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.372734 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.372749 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.476111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.476184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.476202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.476227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.476244 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.579426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.579464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.579473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.579486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.579494 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.682317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.682433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.682452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.682468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.682480 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.785754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.785845 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.785863 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.785891 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.785909 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.889308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.889391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.889404 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.889427 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.889440 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.993127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.993173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.993187 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.993204 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[4830]: I0318 18:04:03.993217 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.096056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.096119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.096137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.096162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.096179 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.199639 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.199693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.199709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.199733 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.199751 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.234159 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:04 crc kubenswrapper[4830]: E0318 18:04:04.234329 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.302929 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.303004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.303021 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.303042 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.303056 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.408436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.408625 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.408654 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.408722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.408746 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.511001 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.511068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.511083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.511107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.511122 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.614264 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.614327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.614337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.614355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.614367 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.716625 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.716689 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.716706 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.716730 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.716746 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.819571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.819616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.819628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.819643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.819654 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.922449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.922499 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.922507 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.922521 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[4830]: I0318 18:04:04.922529 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.024596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.024676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.024695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.024718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.024736 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.129867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.129924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.129938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.129965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.129978 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.233503 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234185 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234265 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.234448 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: E0318 18:04:05.234459 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 18:04:05 crc kubenswrapper[4830]: E0318 18:04:05.234621 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 18:04:05 crc kubenswrapper[4830]: E0318 18:04:05.234722 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.337406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.337474 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.337490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.337520 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.337537 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.441046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.441116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.441146 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.441169 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.441185 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.544439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.544489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.544507 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.544532 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.544551 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.647434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.647493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.647511 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.647537 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.647555 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.676215 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.751110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.751178 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.751199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.751224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.751244 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.855295 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.855370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.855394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.855425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.855451 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.958619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.958686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.958729 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.958765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:05 crc kubenswrapper[4830]: I0318 18:04:05.958848 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.062159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.062210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.062222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.062241 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.062255 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.165098 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.165177 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.165190 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.165212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.165229 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.234209 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.234386 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.248614 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.260664 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.268134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.268172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.268184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.268203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.268215 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.274365 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.303301 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.323970 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.340662 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.353825 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.368483 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.371534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.371609 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.371640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.371672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.371697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.386279 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.399558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.415741 4830 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.433907 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.451671 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.463462 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.474808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.474881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.474907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.474935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.474953 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.480599 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.578181 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.578243 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.578267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.578337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.578361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.681034 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.681162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.681191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.681227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.681250 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.784705 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.784764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.784807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.784836 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.784862 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.885925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.885974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.885986 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.886006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.886021 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.899846 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.904336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.904399 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.904417 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.904441 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.904458 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.915254 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.918820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.918856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.918893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.918909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.918919 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.927815 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.927985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928041 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:14.928004253 +0000 UTC m=+89.495634615 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.928114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928173 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.928188 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.928264 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928204 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928342 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928264 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928376 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928384 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928395 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928435 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:04:14.928405683 +0000 UTC m=+89.496036045 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928394 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928623 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:14.928542617 +0000 UTC m=+89.496172989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928673 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:14.92865577 +0000 UTC m=+89.496286132 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.928706 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:14.92869401 +0000 UTC m=+89.496324382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.930328 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.935329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.935399 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.935420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.935446 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.935466 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.949926 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.954572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.954622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.954631 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.954647 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.954658 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.968350 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f91fa51-750d-472a-937b-41a0fe3990f0\\\",\\\"systemUUID\\\":\\\"633bcea2-a7fe-4f06-927d-dd6893c932b6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:06 crc kubenswrapper[4830]: E0318 18:04:06.968451 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.970371 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.970435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.970459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.970525 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[4830]: I0318 18:04:06.970554 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.029336 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:07 crc kubenswrapper[4830]: E0318 18:04:07.029576 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 18:04:07 crc kubenswrapper[4830]: E0318 18:04:07.029662 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:15.029638934 +0000 UTC m=+89.597269306 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.074361 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.074439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.074463 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.074494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.074517 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.177734 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.177827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.177846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.177872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.177890 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.234250 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.234342 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:07 crc kubenswrapper[4830]: E0318 18:04:07.234414 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 18:04:07 crc kubenswrapper[4830]: E0318 18:04:07.234528 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.234684 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:07 crc kubenswrapper[4830]: E0318 18:04:07.234865 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.280681 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.280754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.280803 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.280832 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.280852 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.383594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.383667 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.383686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.383716 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.383736 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.486731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.486861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.486884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.486909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.486929 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.590285 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.590400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.590424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.590460 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.590488 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.693483 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.693549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.693567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.693594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.693612 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.795760 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.795852 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.795869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.795895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.795913 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.899605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.899673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.899691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.899717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:07 crc kubenswrapper[4830]: I0318 18:04:07.899755 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.002725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.003053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.003089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.003159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.003185 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.106559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.106600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.106609 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.106622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.106630 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.209616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.209703 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.209727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.209758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.209809 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.234455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:04:08 crc kubenswrapper[4830]: E0318 18:04:08.234690 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.312386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.312433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.312446 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.312465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.312480 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.415199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.415259 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.415277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.415302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.415320 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.517551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.517580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.517591 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.517604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.517612 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.619900 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.619966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.619987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.620013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.620030 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.723269 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.723336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.723354 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.723379 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.723396 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.826814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.826860 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.826877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.826901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.826918 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.930199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.930509 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.930688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.930963 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:08 crc kubenswrapper[4830]: I0318 18:04:08.931174 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.034194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.034276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.034306 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.034339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.034363 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.136983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.137071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.137095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.137128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.137182 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.233890 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.234033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:09 crc kubenswrapper[4830]: E0318 18:04:09.234132 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.233894 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:09 crc kubenswrapper[4830]: E0318 18:04:09.234269 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 18:04:09 crc kubenswrapper[4830]: E0318 18:04:09.234378 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.240104 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.240145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.240173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.240195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.240212 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.343862 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.343951 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.343978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.344011 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.344034 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.447247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.447305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.447323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.447345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.447361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.550256 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.550313 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.550329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.550354 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.550374 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.652834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.652871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.652881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.652899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.652911 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.755129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.755183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.755199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.755223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.755241 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.857638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.857699 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.857716 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.857740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.857757 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.960840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.960917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.960944 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.960976 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[4830]: I0318 18:04:09.960998 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.064547 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.064588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.064598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.064613 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.064624 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.167093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.167123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.167131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.167144 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.167153 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.234410 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:10 crc kubenswrapper[4830]: E0318 18:04:10.234640 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.249757 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.269605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.269651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.269661 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.269677 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.269688 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.373358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.373422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.373442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.373466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.373486 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.476276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.476323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.476336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.476407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.476419 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.578610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.578659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.578673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.578696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.578712 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.681228 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.681272 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.681288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.681311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.681328 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.701695 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.783831 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.783901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.783917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.783942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.783959 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.887224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.887305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.887335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.887364 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.887386 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.989384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.989440 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.989457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.989481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[4830]: I0318 18:04:10.989499 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.092620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.092687 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.092706 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.092732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.092754 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.195590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.195676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.195695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.195721 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.195740 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.234290 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.234378 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:11 crc kubenswrapper[4830]: E0318 18:04:11.234478 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.234530 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:11 crc kubenswrapper[4830]: E0318 18:04:11.234674 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:11 crc kubenswrapper[4830]: E0318 18:04:11.234903 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.299365 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.299432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.299455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.299482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.299500 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.401557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.401604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.401615 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.401632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.401645 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.503359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.503833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.503848 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.503869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.503886 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.579835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9c48d3ab24e27a7ba00e635097b4b08ea73d23344f34d58d01b5ab21908a8664"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.581721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5tfzr" event={"ID":"d206127d-732b-421d-85ad-22d8e21c2d45","Type":"ContainerStarted","Data":"d31ff3b574687d20a6ccfa9ea1f63229af216bef5d9a2046bdad6c5ec387ce52"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.600490 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.605810 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.605857 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.605870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.605890 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.605904 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.615872 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.636270 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdaf223b-0ac6-4d47-a763-3aa440f90080\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd3c79efdbce82e14dec081988c8f05523e450d086a8e5dc1fab6cb8aebbe9ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a97ef8900f390c555fed4fd4f83a3b8a04f9b2b41a85f6510bd1ddbd976985ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7658a39c2b154f77998f776e55298f84fa182d2045829eebbc4f29f08bda73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c253b92e92a7458c8a01ed178c35f6010c143f1793ef8819e7f2c783b1775c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44cf543b8f0fea93357d3ae053176ea1630eec591872cda68bd609c6313ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.647965 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.661057 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.677053 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.694346 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.705976 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.708308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.708414 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.708443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.708487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.708513 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.723765 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c48d3ab24e27a7ba00e635097b4b08ea73d23344f34d58d01b5ab21908a8664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.737510 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.754415 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.770855 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.789308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.806750 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.812550 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.812634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.812656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.812685 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.812701 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.821205 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.838475 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.849562 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31ff3b574687d20a6ccfa9ea1f63229af216bef5d9a2046bdad6c5ec387ce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed
63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.865754 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.893051 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.913496 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.915253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.915315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.915335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.915364 4830 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.915382 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.944518 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdaf223b-0ac6-4d47-a763-3aa440f90080\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd3c79efdbce82e14dec081988c8f05523e450d086a8e5dc1fab6cb8aebbe9ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a97ef8900f390c555fed4fd4f83a3b8a04f9b2b41a85f6510bd1ddbd976985ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7658a39c2b154f77998f776e55298f84fa182d2045829eebbc4f29f08bda73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c253b92e92a7458c8a01ed178c35f6010c143f1793ef8819e7f2c783b1775c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44cf543b8f0fea93357d3ae053176ea1630eec591872cda68bd609c6313ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.956998 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.973128 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.983296 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:11 crc kubenswrapper[4830]: I0318 18:04:11.994596 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.008696 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.018008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.018071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.018090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.018117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.018135 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.027152 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c48d3ab24e27a7ba00e635097b4b08ea73d23344f34d58d01b5ab21908a8664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.044294 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.062114 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.074576 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.087571 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.098202 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.120822 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.120879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.120902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.120930 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.120950 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.224288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.224351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.224368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.224392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.224411 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.234163 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:04:12 crc kubenswrapper[4830]: E0318 18:04:12.234316 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.327165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.327271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.327297 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.327327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.327350 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.381940 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.398076 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.415762 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.429425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.429459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.429467 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.429483 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.429492 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.434093 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.447093 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.463675 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c48d3ab24e27a7ba00e635097b4b08ea73d23344f34d58d01b5ab21908a8664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.479385 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.489116 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31ff3b574687d20a6ccfa9ea1f63229af216bef5d9a2046bdad6c5ec387ce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.504936 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 
18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.519813 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.532424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.532506 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.532530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.532567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.532588 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.534956 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.546085 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.560174 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.579852 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.602828 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.622891 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdaf223b-0ac6-4d47-a763-3aa440f90080\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd3c79efdbce82e14dec081988c8f05523e450d086a8e5dc1fab6cb8aebbe9ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a97ef8900f390c555fed4fd4f83a3b8a04f9b2b41a85f6510bd1ddbd976985ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7658a39c2b154f77998f776e55298f84fa182d2045829eebbc4f29f08bda73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c253b92e92a7458c8a01ed178c35f6010c143f1793ef8819e7f2c783b1775c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44cf543b8f0fea93357d3ae053176ea1630eec591872cda68bd609c6313ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.635719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.635827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.635845 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.635873 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.635889 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.637265 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.738951 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 
18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.739003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.739013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.739046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.739060 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.841666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.841742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.841766 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.841836 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.841865 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.945165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.945214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.945227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.945245 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[4830]: I0318 18:04:12.945258 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.049204 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.049270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.049288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.049312 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.049331 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.152043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.152087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.152096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.152113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.152123 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.233928 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.234043 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.234078 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:13 crc kubenswrapper[4830]: E0318 18:04:13.234134 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:13 crc kubenswrapper[4830]: E0318 18:04:13.234458 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:13 crc kubenswrapper[4830]: E0318 18:04:13.234644 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.255432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.255482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.255495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.255516 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.255528 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.359170 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.359207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.359218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.359236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.359250 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.461168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.461210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.461218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.461232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.461243 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.564600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.564659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.564676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.564698 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.564717 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.667875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.667952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.667969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.667996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.668015 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.771541 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.771607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.771632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.771663 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.771682 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.874236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.874281 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.874300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.874321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.874338 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.977956 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.978019 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.978039 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.978063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[4830]: I0318 18:04:13.978082 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.080711 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.080819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.080846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.080876 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.080899 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.183812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.183887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.183912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.183943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.183967 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.234612 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.234906 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.286584 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.286645 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.286661 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.286683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.286697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.388715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.388748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.388758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.388788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.388801 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.491120 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.491561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.491573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.491592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.491604 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.593742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.593818 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.593831 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.593878 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.593896 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.595720 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"259ed3b1a8b8af90c86863ce643b9bab5be41f5df67499bc33ee2e175003b867"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.595791 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.597516 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" exitCode=0 Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.597602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.600593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"17b05309a4d675534fcb4a61d75e557ff90f62dc82e8a2100b64976e61730ca2"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.600645 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64ab21bf35fd2ca9b93f2eb114db2c2d7af170606d1f94faea307897086a5a7c"} Mar 18 18:04:14 crc kubenswrapper[4830]: 
I0318 18:04:14.603648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" event={"ID":"0eb10a6f-af83-4366-9613-6350e3297007","Type":"ContainerStarted","Data":"96791d02366262344457fbc80c5c1c1a3eb97647a97ba278de413009d5ba2aba"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.610612 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.624043 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.635631 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.647541 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://259ed3b1a8b8af90c86863ce643b9bab5be41f5df67499bc33ee2e175003b867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de4e26f6767da64f02f9792da506b0d4a20c0e1
5b76e432cd3ee81dff89156a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.667878 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.684005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.696320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.696377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.696398 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.696423 4830 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.696443 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.706041 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdaf223b-0ac6-4d47-a763-3aa440f90080\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd3c79efdbce82e14dec081988c8f05523e450d086a8e5dc1fab6cb8aebbe9ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a97ef8900f390c555fed4fd4f83a3b8a04f9b2b41a85f6510bd1ddbd976985ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7658a39c2b154f77998f776e55298f84fa182d2045829eebbc4f29f08bda73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c253b92e92a7458c8a01ed178c35f6010c143f1793ef8819e7f2c783b1775c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44cf543b8f0fea93357d3ae053176ea1630eec591872cda68bd609c6313ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.718110 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eb10a6f-af83-4366-9613-6350e3297007\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkv8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nnmtt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.731342 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.745245 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.765522 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpw8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55b8eced-700a-4b44-8315-c5afac8ca1bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6c9kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpw8m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.777022 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437f27f7-4531-4e3e-b3a9-a471c7630012\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24d89\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wx6kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.790348 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c48d3ab24e27a7ba00e635097b4b08ea73d23344f34d58d01b5ab21908a8664\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.848389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.848424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.848433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.848449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.848459 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.849895 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.864833 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31ff3b574687d20a6ccfa9ea1f63229af216bef5d9a2046bdad6c5ec387ce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.880737 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 
18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.889911 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5tfzr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d206127d-732b-421d-85ad-22d8e21c2d45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31ff3b574687d20a6ccfa9ea1f63229af216bef5d9a2046bdad6c5ec387ce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq7m7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5tfzr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.901029 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cd4d209-2ecf-4749-bf99-6819f6608a4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 18:03:26.587125 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 18:03:26.587239 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 18:03:26.587934 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131405266/tls.crt::/tmp/serving-cert-1131405266/tls.key\\\\\\\"\\\\nI0318 18:03:26.953519 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 18:03:26.955923 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 18:03:26.955940 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 18:03:26.955958 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 18:03:26.955963 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 18:03:26.960855 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 18:03:26.960873 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 
18:03:26.960890 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960899 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 18:03:26.960906 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 18:03:26.960913 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 18:03:26.960917 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 18:03:26.960922 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 18:03:26.962438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.919202 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdaf223b-0ac6-4d47-a763-3aa440f90080\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd3c79efdbce82e14dec081988c8f05523e450d086a8e5dc1fab6cb8aebbe9ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a97ef8900f390c555fed4fd4f83a3b8a04f9b2b41a85f6510bd1ddbd976985ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7658a39c2b154f77998f776e55298f84fa182d2045829eebbc4f29f08bda73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c253b92e92a7458c8a01ed178c35f6010c143f1793ef8819e7f2c783b1775c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44cf543b8f0fea93357d3ae053176ea1630eec591872cda68bd609c6313ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c15f48f3fa235651f99cbfcf9cd993a0485b6256a36016b0c9998c27b810ba19\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ccc52e48a6f9894012a3b7d3a96a0e7f198549ea319a502edad64c9c37f7e06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fb47a1b541afa2621f4a50b182cfe71133422fbd2c8ac8b720420d81486625e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.929837 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.941329 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.944193 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944300 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.944280401 +0000 UTC m=+105.511910733 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.944344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.944399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.944440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944494 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944629 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944664 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944636 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.9446287 +0000 UTC m=+105.512259032 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944652 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.944502 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944715 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.944708772 +0000 UTC m=+105.512339104 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944581 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944741 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944753 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944687 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944844 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.944824615 +0000 UTC m=+105.512455017 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:14 crc kubenswrapper[4830]: E0318 18:04:14.944930 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.944890297 +0000 UTC m=+105.512520799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.950553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.950596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.950607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.950622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.950634 4830 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.951575 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fvhfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bacdd483-ef3d-43b9-92c1-67f1eac421ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwr9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fvhfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.961994 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe02a32-24dc-4772-8a10-0128d3a304e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://259ed3b1a8b8af90c86863ce643b9bab5be41f5df67499bc33ee2e175003b867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de4e26f6767da64f02f9792da506b0d4a20c0e1
5b76e432cd3ee81dff89156a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-plzpb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.983562 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8r9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vjt8t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:14 crc kubenswrapper[4830]: I0318 18:04:14.996513 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f094c167-4135-4e16-97f3-2759780a857a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jg8nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c5rtg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.045399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:15 crc kubenswrapper[4830]: E0318 18:04:15.045616 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:15 crc 
kubenswrapper[4830]: E0318 18:04:15.045788 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs podName:437f27f7-4531-4e3e-b3a9-a471c7630012 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.045724018 +0000 UTC m=+105.613354530 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs") pod "network-metrics-daemon-wx6kd" (UID: "437f27f7-4531-4e3e-b3a9-a471c7630012") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.053966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.054004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.054016 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.054032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.054042 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.155861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.155902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.155911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.155925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.155935 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.234286 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:15 crc kubenswrapper[4830]: E0318 18:04:15.234495 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.234620 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:15 crc kubenswrapper[4830]: E0318 18:04:15.234705 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.234789 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:15 crc kubenswrapper[4830]: E0318 18:04:15.234853 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.259031 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.259471 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.259489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.259511 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.259524 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.363558 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.363608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.363621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.363640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.363653 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.466294 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.466325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.466332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.466345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.466354 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.569332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.569375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.569387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.569406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.569418 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.608559 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" event={"ID":"0eb10a6f-af83-4366-9613-6350e3297007","Type":"ContainerStarted","Data":"4f1038fac6bf785ba2d24ef40ff16b40db4395f2bb3c44a01cd595143970935a"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.613993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.614039 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.614059 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.614077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.614095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"} Mar 18 18:04:15 crc 
kubenswrapper[4830]: I0318 18:04:15.637881 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podStartSLOduration=49.637860498 podStartE2EDuration="49.637860498s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:15.63750677 +0000 UTC m=+90.205137102" watchObservedRunningTime="2026-03-18 18:04:15.637860498 +0000 UTC m=+90.205490840" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.671154 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.671184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.671194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.671210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.671220 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.725417 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=5.725371043 podStartE2EDuration="5.725371043s" podCreationTimestamp="2026-03-18 18:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:15.723720961 +0000 UTC m=+90.291351293" watchObservedRunningTime="2026-03-18 18:04:15.725371043 +0000 UTC m=+90.293001375" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.770877 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nnmtt" podStartSLOduration=48.770754001 podStartE2EDuration="48.770754001s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:15.77073754 +0000 UTC m=+90.338367892" watchObservedRunningTime="2026-03-18 18:04:15.770754001 +0000 UTC m=+90.338384353" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.775965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.776002 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.776014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.776049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.776060 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.805723 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.805701055 podStartE2EDuration="14.805701055s" podCreationTimestamp="2026-03-18 18:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:15.804873994 +0000 UTC m=+90.372504336" watchObservedRunningTime="2026-03-18 18:04:15.805701055 +0000 UTC m=+90.373331387" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.805880 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5tfzr" podStartSLOduration=49.805875509 podStartE2EDuration="49.805875509s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:15.78851527 +0000 UTC m=+90.356145612" watchObservedRunningTime="2026-03-18 18:04:15.805875509 +0000 UTC m=+90.373505841" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.879504 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.879556 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.879569 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.879592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.879606 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.983467 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.983515 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.983531 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.983550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[4830]: I0318 18:04:15.983562 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.087533 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.087594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.087609 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.087629 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.087647 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.190992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.191443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.191454 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.191471 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.191504 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.236688 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:16 crc kubenswrapper[4830]: E0318 18:04:16.237424 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.252254 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.294873 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.294908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.294920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.294938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.294949 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.397086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.397151 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.397164 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.397222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.397240 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.500188 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.500551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.500559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.500572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.500582 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.603543 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.603601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.603612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.603632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.603646 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.619195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerStarted","Data":"ced75054c90ba320ed3a8a3d90f40ff2ec0fe7e5fac5cc56f91695a5908e7fd4"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.621451 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpw8m" event={"ID":"55b8eced-700a-4b44-8315-c5afac8ca1bf","Type":"ContainerStarted","Data":"b2527278040093822f87c66655799a34ece81575e3a39c64302b99c1b2945142"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.623602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fvhfm" event={"ID":"bacdd483-ef3d-43b9-92c1-67f1eac421ad","Type":"ContainerStarted","Data":"093aeedb6ea721c32933f7b470cc91fb92a36f13c668e979b26316e830aa1359"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.633925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.655313 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.655282918 podStartE2EDuration="655.282918ms" podCreationTimestamp="2026-03-18 18:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:16.654863598 +0000 UTC m=+91.222493970" watchObservedRunningTime="2026-03-18 18:04:16.655282918 +0000 UTC m=+91.222913250" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.670002 4830 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/node-resolver-fvhfm" podStartSLOduration=50.66997308 podStartE2EDuration="50.66997308s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:16.66881379 +0000 UTC m=+91.236444122" watchObservedRunningTime="2026-03-18 18:04:16.66997308 +0000 UTC m=+91.237603452" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.689540 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zpw8m" podStartSLOduration=50.689511624 podStartE2EDuration="50.689511624s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:16.688734454 +0000 UTC m=+91.256364786" watchObservedRunningTime="2026-03-18 18:04:16.689511624 +0000 UTC m=+91.257141996" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.706908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.706965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.706976 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.706995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.707007 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.809851 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.809896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.809906 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.809928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.809943 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.912369 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.912434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.912444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.912466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[4830]: I0318 18:04:16.912480 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.014956 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.015031 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.015055 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.015084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.015106 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.117974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.118003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.118014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.118030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.118043 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.131493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.131557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.131580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.131611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.131633 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.187537 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.193720 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq"] Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.194231 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.196212 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.196386 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.196390 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.196626 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.198232 4830 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.234477 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:17 crc kubenswrapper[4830]: E0318 18:04:17.234639 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.234718 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.234729 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:17 crc kubenswrapper[4830]: E0318 18:04:17.234859 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:17 crc kubenswrapper[4830]: E0318 18:04:17.235017 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.270552 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.270597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.270615 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.270639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.270862 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372387 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372523 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.372669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.373479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.380175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.398533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fc06932-e8d5-4a6d-b572-841cdd9b9a3d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pt4jq\" (UID: \"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.506909 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" Mar 18 18:04:17 crc kubenswrapper[4830]: W0318 18:04:17.522346 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc06932_e8d5_4a6d_b572_841cdd9b9a3d.slice/crio-f5a439a3a3962ddf612c53dcc54210cc3de893bbbcb4040c39b143db07fdd6f3 WatchSource:0}: Error finding container f5a439a3a3962ddf612c53dcc54210cc3de893bbbcb4040c39b143db07fdd6f3: Status 404 returned error can't find the container with id f5a439a3a3962ddf612c53dcc54210cc3de893bbbcb4040c39b143db07fdd6f3 Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.638290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a2dc1134e70826d624ced2fd2d325ff2bf6cf855a04426894a19ae437628cd37"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.641862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" 
event={"ID":"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d","Type":"ContainerStarted","Data":"aaad3153df60eb729f6b4499419d7dca0a03e72b4136ad1821294f8cdb9dd041"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.641905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" event={"ID":"1fc06932-e8d5-4a6d-b572-841cdd9b9a3d","Type":"ContainerStarted","Data":"f5a439a3a3962ddf612c53dcc54210cc3de893bbbcb4040c39b143db07fdd6f3"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.645237 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="ced75054c90ba320ed3a8a3d90f40ff2ec0fe7e5fac5cc56f91695a5908e7fd4" exitCode=0 Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.645328 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"ced75054c90ba320ed3a8a3d90f40ff2ec0fe7e5fac5cc56f91695a5908e7fd4"} Mar 18 18:04:17 crc kubenswrapper[4830]: I0318 18:04:17.708815 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pt4jq" podStartSLOduration=51.708759321 podStartE2EDuration="51.708759321s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:17.708484724 +0000 UTC m=+92.276115056" watchObservedRunningTime="2026-03-18 18:04:17.708759321 +0000 UTC m=+92.276389683" Mar 18 18:04:18 crc kubenswrapper[4830]: I0318 18:04:18.233857 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:18 crc kubenswrapper[4830]: E0318 18:04:18.233997 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:18 crc kubenswrapper[4830]: I0318 18:04:18.655936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} Mar 18 18:04:18 crc kubenswrapper[4830]: I0318 18:04:18.657859 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="faad14ff949dce8cd56d5082d889cf689eb8ad5f63fe4da38cd3f80a9edfb6f9" exitCode=0 Mar 18 18:04:18 crc kubenswrapper[4830]: I0318 18:04:18.657893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"faad14ff949dce8cd56d5082d889cf689eb8ad5f63fe4da38cd3f80a9edfb6f9"} Mar 18 18:04:19 crc kubenswrapper[4830]: I0318 18:04:19.234372 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:19 crc kubenswrapper[4830]: I0318 18:04:19.234443 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:19 crc kubenswrapper[4830]: I0318 18:04:19.234382 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:19 crc kubenswrapper[4830]: E0318 18:04:19.234579 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:19 crc kubenswrapper[4830]: E0318 18:04:19.234694 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:19 crc kubenswrapper[4830]: E0318 18:04:19.234879 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:19 crc kubenswrapper[4830]: I0318 18:04:19.663732 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="15e214e16cc466e42983c908c98690612b63e43b81b92491016eb267f922d992" exitCode=0 Mar 18 18:04:19 crc kubenswrapper[4830]: I0318 18:04:19.663825 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"15e214e16cc466e42983c908c98690612b63e43b81b92491016eb267f922d992"} Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.127600 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.233931 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:20 crc kubenswrapper[4830]: E0318 18:04:20.234110 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.670920 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="a88169133241c6daa19c2562afb214ddd54d2f487437661d7597b15c218c7123" exitCode=0 Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.671024 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"a88169133241c6daa19c2562afb214ddd54d2f487437661d7597b15c218c7123"} Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.680360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerStarted","Data":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.681862 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.682038 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.682075 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.711451 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 18:04:20.712982 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:04:20 crc kubenswrapper[4830]: I0318 
18:04:20.728020 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podStartSLOduration=53.727988956 podStartE2EDuration="53.727988956s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:20.72537177 +0000 UTC m=+95.293002122" watchObservedRunningTime="2026-03-18 18:04:20.727988956 +0000 UTC m=+95.295619298" Mar 18 18:04:21 crc kubenswrapper[4830]: I0318 18:04:21.233866 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:21 crc kubenswrapper[4830]: I0318 18:04:21.233878 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:21 crc kubenswrapper[4830]: E0318 18:04:21.234411 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:21 crc kubenswrapper[4830]: I0318 18:04:21.233953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:21 crc kubenswrapper[4830]: E0318 18:04:21.234479 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:21 crc kubenswrapper[4830]: E0318 18:04:21.234617 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:21 crc kubenswrapper[4830]: I0318 18:04:21.686188 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="2616097d82630bd41232b9f09873c533586c9abcd01d09533cd69f8ccf3d3d78" exitCode=0 Mar 18 18:04:21 crc kubenswrapper[4830]: I0318 18:04:21.687149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"2616097d82630bd41232b9f09873c533586c9abcd01d09533cd69f8ccf3d3d78"} Mar 18 18:04:22 crc kubenswrapper[4830]: I0318 18:04:22.235037 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:22 crc kubenswrapper[4830]: E0318 18:04:22.235200 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:22 crc kubenswrapper[4830]: I0318 18:04:22.673131 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wx6kd"] Mar 18 18:04:22 crc kubenswrapper[4830]: I0318 18:04:22.673335 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:22 crc kubenswrapper[4830]: E0318 18:04:22.673479 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:22 crc kubenswrapper[4830]: I0318 18:04:22.692961 4830 generic.go:334] "Generic (PLEG): container finished" podID="f094c167-4135-4e16-97f3-2759780a857a" containerID="94e20f6f5a5982f3245a19d1415109f6ec5bff377b033b0787d0499b6f4d353c" exitCode=0 Mar 18 18:04:22 crc kubenswrapper[4830]: I0318 18:04:22.693026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerDied","Data":"94e20f6f5a5982f3245a19d1415109f6ec5bff377b033b0787d0499b6f4d353c"} Mar 18 18:04:23 crc kubenswrapper[4830]: I0318 18:04:23.233789 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:23 crc kubenswrapper[4830]: E0318 18:04:23.233942 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:23 crc kubenswrapper[4830]: I0318 18:04:23.234386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:23 crc kubenswrapper[4830]: E0318 18:04:23.234464 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:23 crc kubenswrapper[4830]: I0318 18:04:23.705099 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" event={"ID":"f094c167-4135-4e16-97f3-2759780a857a","Type":"ContainerStarted","Data":"0ae55cb47913ba0e9fc282b49f1575b9a2372ae2ab6fba61ebc37f08722bb47e"} Mar 18 18:04:23 crc kubenswrapper[4830]: I0318 18:04:23.733370 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c5rtg" podStartSLOduration=56.73334608 podStartE2EDuration="56.73334608s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:23.731077053 +0000 UTC m=+98.298707425" watchObservedRunningTime="2026-03-18 18:04:23.73334608 +0000 UTC m=+98.300976452" Mar 18 18:04:24 crc kubenswrapper[4830]: I0318 18:04:24.234341 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:24 crc kubenswrapper[4830]: E0318 18:04:24.234507 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:24 crc kubenswrapper[4830]: I0318 18:04:24.234534 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:24 crc kubenswrapper[4830]: E0318 18:04:24.234614 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wx6kd" podUID="437f27f7-4531-4e3e-b3a9-a471c7630012" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.233795 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.233853 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:25 crc kubenswrapper[4830]: E0318 18:04:25.233895 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:25 crc kubenswrapper[4830]: E0318 18:04:25.234010 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.287847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.288151 4830 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.330033 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.330422 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.332169 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.332368 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.333968 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.334369 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.336217 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.336312 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.336362 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.336527 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.339709 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.339827 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.340038 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.340120 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.340240 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.342481 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kr648"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.343222 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.343916 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.343966 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.344155 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.346202 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f77k6"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.346982 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.351070 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.351534 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.355970 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.360285 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.360319 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.360637 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.360831 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.368095 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.369432 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.370957 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.373892 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.375196 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.390139 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.392532 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.393164 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s77pq"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.393562 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.393900 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.393994 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.394545 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.402516 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.403008 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.403157 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.403298 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.403441 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.403581 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.404549 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.404617 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.405251 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.405345 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.405871 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.407892 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.412342 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.412410 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.412480 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.412531 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.414109 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cww5w"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.414534 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cww5w"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.414863 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.415240 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.416754 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.417051 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.417649 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.417992 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.418939 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.419446 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.419823 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.420120 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.420647 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6twn"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.420970 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kr648"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.421046 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.422705 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.423221 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.423794 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.423896 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.423925 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.423942 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.424186 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425044 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425356 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425504 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425649 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425808 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.425955 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426089 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426226 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426389 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426528 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426669 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.426980 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.427072 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.427512 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xv8d6"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.427794 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.427912 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.431594 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.432100 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.432284 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.434800 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bzlw5"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.447976 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.448454 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.448807 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.449018 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.449317 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.449853 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.450039 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.450581 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.451108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.451601 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.452095 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.452337 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.452757 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.453094 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.453715 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.454153 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.462169 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.488684 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t7xl2"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.489122 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t7xl2"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.490399 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.488691 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491290 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.490707 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491730 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-encryption-config\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvcj\" (UniqueName: \"kubernetes.io/projected/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-kube-api-access-jhvcj\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491804 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491834 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491853 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9279fbd5-1378-4a9a-b35d-85a7b9430674-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491868 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491903 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zxs7\" (UniqueName: \"kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491948 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkxts\" (UniqueName: \"kubernetes.io/projected/d43b41ee-1e3d-42bf-8856-0678d441ac96-kube-api-access-dkxts\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbwn\" (UniqueName: \"kubernetes.io/projected/9279fbd5-1378-4a9a-b35d-85a7b9430674-kube-api-access-hbbwn\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.491980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-dir\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492276 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-images\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492301 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492315 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-config\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492333 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492350 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-policies\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492366 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-serving-cert\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492383 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-client\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.492603 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.496573 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.497151 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.498018 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.498418 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.498967 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.500075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.500667 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.500941 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.501113 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.501323 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.501362 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.501470 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.503437 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.503562 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.504583 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.504752 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.504996 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.505018 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.505399 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.507294 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.507835 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512458 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512535 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512634 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512672 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512678 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.512974 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.514827 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.517188 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.518039 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.518784 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wnkwx"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.520094 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.524225 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.524840 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wjtsp"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.525061 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.525261 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp"
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.525307 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"]
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.525716 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.528508 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.528960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.534018 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.534877 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lfz57"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.535425 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.535843 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.536391 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.537203 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mfdzp"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.537694 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.539106 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.545955 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.546229 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.547253 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.549053 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.551619 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.552806 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.555353 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.556027 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.558229 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.558657 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.559070 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.559446 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.560059 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f77k6"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.561977 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qx95k"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.562706 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.563069 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vk4md"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.563550 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.564192 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.565655 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cww5w"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.566661 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s77pq"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.567645 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.568646 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.569725 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.571875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.572349 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.573053 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.574644 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.576514 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wjtsp"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.579073 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.582908 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.584840 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t7xl2"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.586829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.589259 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.591109 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.592268 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.593005 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.594135 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"] 
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.594432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-client\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-encryption-config\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595508 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-serving-cert\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595506 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595538 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jhvcj\" (UniqueName: \"kubernetes.io/projected/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-kube-api-access-jhvcj\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595601 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbpk\" (UniqueName: \"kubernetes.io/projected/c8de305f-4cde-4354-ad95-b74003e014a2-kube-api-access-xwbpk\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595625 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595647 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595670 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" 
(UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595690 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595706 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-serving-cert\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595728 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2pc\" (UniqueName: \"kubernetes.io/projected/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-kube-api-access-rc2pc\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595761 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" Mar 18 18:04:25 crc 
kubenswrapper[4830]: I0318 18:04:25.595799 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-client\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl5sk\" (UniqueName: \"kubernetes.io/projected/01d0fb39-b10a-4717-8e77-ed73f95166bd-kube-api-access-tl5sk\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9279fbd5-1378-4a9a-b35d-85a7b9430674-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595893 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-config\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595911 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scv79\" (UniqueName: \"kubernetes.io/projected/96025a62-2435-451a-93bf-b03d24d6cfc1-kube-api-access-scv79\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595928 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215da1aa-97ec-4ef7-a65d-597190dc6c63-trusted-ca\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595946 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96025a62-2435-451a-93bf-b03d24d6cfc1-serving-cert\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595965 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/01d0fb39-b10a-4717-8e77-ed73f95166bd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.595984 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-image-import-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596000 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596017 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596034 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-service-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596050 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/215da1aa-97ec-4ef7-a65d-597190dc6c63-metrics-tls\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596065 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-audit\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596098 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4nb6\" (UniqueName: \"kubernetes.io/projected/2b7160ce-096f-4305-9954-982608b133ac-kube-api-access-s4nb6\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596116 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvnn\" (UniqueName: \"kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8de305f-4cde-4354-ad95-b74003e014a2-proxy-tls\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596162 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596159 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596176 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-etcd-serving-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-node-pullsecrets\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596425 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zxs7\" (UniqueName: \"kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596442 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhl9k\" (UniqueName: \"kubernetes.io/projected/602fd67e-3c82-46aa-879d-17bbd976e85b-kube-api-access-lhl9k\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596459 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596475 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596489 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-etcd-client\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596505 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8de305f-4cde-4354-ad95-b74003e014a2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596521 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596538 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmntm\" (UniqueName: \"kubernetes.io/projected/57c79b01-642f-4c45-886c-b3e852c0bc23-kube-api-access-zmntm\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596554 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-audit-dir\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596561 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596577 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkxts\" (UniqueName: \"kubernetes.io/projected/d43b41ee-1e3d-42bf-8856-0678d441ac96-kube-api-access-dkxts\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 
18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596596 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596615 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/602fd67e-3c82-46aa-879d-17bbd976e85b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596636 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596661 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmw9w\" (UniqueName: \"kubernetes.io/projected/498fcaeb-168e-4860-9f5d-a7c72ee1808f-kube-api-access-fmw9w\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42w7x\" (UniqueName: 
\"kubernetes.io/projected/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-kube-api-access-42w7x\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596701 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbwn\" (UniqueName: \"kubernetes.io/projected/9279fbd5-1378-4a9a-b35d-85a7b9430674-kube-api-access-hbbwn\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596747 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-dir\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596801 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2tl\" (UniqueName: \"kubernetes.io/projected/8b0c7aa1-248e-4847-97f5-c08c17e78c3d-kube-api-access-rz2tl\") pod \"downloads-7954f5f757-t7xl2\" (UID: \"8b0c7aa1-248e-4847-97f5-c08c17e78c3d\") " pod="openshift-console/downloads-7954f5f757-t7xl2" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596839 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596857 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596877 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9v5\" (UniqueName: \"kubernetes.io/projected/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-kube-api-access-ld9v5\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596900 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/57c79b01-642f-4c45-886c-b3e852c0bc23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596919 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-serving-cert\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-images\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-config\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596974 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.596987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597011 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-config\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597032 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597084 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-policies\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597129 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-config\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597153 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-serving-cert\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597183 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8s7\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-kube-api-access-nf8s7\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597203 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-trusted-ca\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597222 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-encryption-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855js\" (UniqueName: \"kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597316 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bzlw5"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597520 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-dir\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.597992 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599013 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-audit-policies\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599065 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xv8d6"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-config\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599193 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9279fbd5-1378-4a9a-b35d-85a7b9430674-images\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.599781 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wnkwx"] 
Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.600882 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r267p"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.600930 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9279fbd5-1378-4a9a-b35d-85a7b9430674-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.602604 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.603479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-serving-cert\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.603868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-etcd-client\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.604110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43b41ee-1e3d-42bf-8856-0678d441ac96-encryption-config\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 
18:04:25.604165 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6twn"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.604530 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.605255 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.606102 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qx95k"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.607158 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gjjbf"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.607866 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.608241 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lfz57"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.609170 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.609427 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.612037 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.612514 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.613600 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.619897 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.632612 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.636302 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.638014 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r267p"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.639047 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.640735 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gjjbf"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.641890 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.643805 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lrwxl"] Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.644340 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.653012 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.672258 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.692156 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697803 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-client\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2kn\" (UniqueName: \"kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697878 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrcp\" (UniqueName: \"kubernetes.io/projected/b2e93e85-70a0-4853-8fc1-2101d5b26069-kube-api-access-kvrcp\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697894 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-config\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697911 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215da1aa-97ec-4ef7-a65d-597190dc6c63-trusted-ca\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-image-import-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697943 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: 
\"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697963 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697981 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/215da1aa-97ec-4ef7-a65d-597190dc6c63-metrics-tls\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.697999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4nb6\" (UniqueName: \"kubernetes.io/projected/2b7160ce-096f-4305-9954-982608b133ac-kube-api-access-s4nb6\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698015 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-default-certificate\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698030 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698047 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvnn\" (UniqueName: \"kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8de305f-4cde-4354-ad95-b74003e014a2-proxy-tls\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-etcd-serving-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698096 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtw6\" (UniqueName: \"kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 
crc kubenswrapper[4830]: I0318 18:04:25.698111 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e93e85-70a0-4853-8fc1-2101d5b26069-tmpfs\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698145 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8de305f-4cde-4354-ad95-b74003e014a2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698161 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698175 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-audit-dir\") pod 
\"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abd70175-8048-40dc-8f82-72d1112b0af0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-apiservice-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698264 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698279 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvxs\" (UniqueName: \"kubernetes.io/projected/69e95a31-ebb4-4647-b358-ad5a85023485-kube-api-access-cfvxs\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmw9w\" (UniqueName: \"kubernetes.io/projected/498fcaeb-168e-4860-9f5d-a7c72ee1808f-kube-api-access-fmw9w\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: 
\"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698349 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698364 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng69v\" (UniqueName: \"kubernetes.io/projected/2782ade8-7344-423c-8ace-e9fe0b0fd207-kube-api-access-ng69v\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698388 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2tl\" (UniqueName: \"kubernetes.io/projected/8b0c7aa1-248e-4847-97f5-c08c17e78c3d-kube-api-access-rz2tl\") pod \"downloads-7954f5f757-t7xl2\" (UID: \"8b0c7aa1-248e-4847-97f5-c08c17e78c3d\") " pod="openshift-console/downloads-7954f5f757-t7xl2" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698404 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc 
kubenswrapper[4830]: I0318 18:04:25.698418 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-config\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698451 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9v5\" (UniqueName: \"kubernetes.io/projected/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-kube-api-access-ld9v5\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/57c79b01-642f-4c45-886c-b3e852c0bc23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698483 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-service-ca-bundle\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698516 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698532 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8s7\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-kube-api-access-nf8s7\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698548 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-config\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-trusted-ca\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698594 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855js\" (UniqueName: \"kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698627 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-serving-cert\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698660 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpk\" (UniqueName: \"kubernetes.io/projected/c8de305f-4cde-4354-ad95-b74003e014a2-kube-api-access-xwbpk\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698697 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698713 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698760 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698793 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-webhook-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698812 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2pc\" (UniqueName: \"kubernetes.io/projected/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-kube-api-access-rc2pc\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698830 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-serving-cert\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698853 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl5sk\" (UniqueName: \"kubernetes.io/projected/01d0fb39-b10a-4717-8e77-ed73f95166bd-kube-api-access-tl5sk\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698869 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scv79\" (UniqueName: \"kubernetes.io/projected/96025a62-2435-451a-93bf-b03d24d6cfc1-kube-api-access-scv79\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698883 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698915 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96025a62-2435-451a-93bf-b03d24d6cfc1-serving-cert\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/01d0fb39-b10a-4717-8e77-ed73f95166bd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698950 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsvv\" (UniqueName: \"kubernetes.io/projected/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-kube-api-access-lcsvv\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698966 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698981 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.698998 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-service-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699012 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-audit\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699026 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ht7b\" (UniqueName: \"kubernetes.io/projected/2032a3f5-b88c-423b-a25d-3768950ac81c-kube-api-access-7ht7b\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699058 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699672 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-node-pullsecrets\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699732 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhl9k\" (UniqueName: 
\"kubernetes.io/projected/602fd67e-3c82-46aa-879d-17bbd976e85b-kube-api-access-lhl9k\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699815 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-etcd-client\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmntm\" (UniqueName: \"kubernetes.io/projected/57c79b01-642f-4c45-886c-b3e852c0bc23-kube-api-access-zmntm\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699885 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/602fd67e-3c82-46aa-879d-17bbd976e85b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699916 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-metrics-tls\") pod \"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 
18:04:25.699952 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42w7x\" (UniqueName: \"kubernetes.io/projected/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-kube-api-access-42w7x\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.699975 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.700001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.700025 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.700049 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.700092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6njd\" (UniqueName: \"kubernetes.io/projected/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-kube-api-access-x6njd\") pod \"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.700455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-config\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.701009 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.701674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.701695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.702407 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.702460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-node-pullsecrets\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.702617 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.703213 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-audit\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.703472 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-client\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.704743 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-etcd-serving-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705153 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8de305f-4cde-4354-ad95-b74003e014a2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/498fcaeb-168e-4860-9f5d-a7c72ee1808f-etcd-service-ca\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705604 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96025a62-2435-451a-93bf-b03d24d6cfc1-serving-cert\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705624 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705698 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-serving-cert\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-config\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.705843 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b7160ce-096f-4305-9954-982608b133ac-audit-dir\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 
18:04:25.705963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-trusted-ca\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706157 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706231 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-config\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706518 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706553 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46bgv\" (UniqueName: \"kubernetes.io/projected/01289133-eb09-4497-8df9-bfd2ee3e0357-kube-api-access-46bgv\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706588 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706642 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-encryption-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.706887 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-image-import-ca\") pod \"apiserver-76f77b778f-bzlw5\" (UID: 
\"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707021 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b7160ce-096f-4305-9954-982608b133ac-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707063 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/215da1aa-97ec-4ef7-a65d-597190dc6c63-trusted-ca\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707169 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-metrics-certs\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707315 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-config\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-auth-proxy-config\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707530 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.707830 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.708134 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-etcd-client\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.708271 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/96025a62-2435-451a-93bf-b03d24d6cfc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.708948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/602fd67e-3c82-46aa-879d-17bbd976e85b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.709220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/57c79b01-642f-4c45-886c-b3e852c0bc23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.709275 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/01d0fb39-b10a-4717-8e77-ed73f95166bd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.709383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498fcaeb-168e-4860-9f5d-a7c72ee1808f-serving-cert\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.710892 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.712209 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-serving-cert\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.712289 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.716283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-encryption-config\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.717423 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.717504 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/215da1aa-97ec-4ef7-a65d-597190dc6c63-metrics-tls\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.717586 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.718964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8de305f-4cde-4354-ad95-b74003e014a2-proxy-tls\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.719887 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b7160ce-096f-4305-9954-982608b133ac-serving-cert\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.722734 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.732287 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.760459 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.772550 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.793038 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.808971 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-metrics-tls\") pod \"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809289 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809351 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809481 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6njd\" (UniqueName: \"kubernetes.io/projected/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-kube-api-access-x6njd\") pod 
\"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809670 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809733 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bgv\" (UniqueName: \"kubernetes.io/projected/01289133-eb09-4497-8df9-bfd2ee3e0357-kube-api-access-46bgv\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809828 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-metrics-certs\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.809977 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810039 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-auth-proxy-config\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810182 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc 
kubenswrapper[4830]: I0318 18:04:25.810251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2kn\" (UniqueName: \"kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrcp\" (UniqueName: \"kubernetes.io/projected/b2e93e85-70a0-4853-8fc1-2101d5b26069-kube-api-access-kvrcp\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810507 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-default-certificate\") pod 
\"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810642 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810713 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khtw6\" (UniqueName: \"kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810796 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e93e85-70a0-4853-8fc1-2101d5b26069-tmpfs\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810883 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abd70175-8048-40dc-8f82-72d1112b0af0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.810989 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-apiservice-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811068 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvxs\" (UniqueName: \"kubernetes.io/projected/69e95a31-ebb4-4647-b358-ad5a85023485-kube-api-access-cfvxs\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811305 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811517 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng69v\" (UniqueName: \"kubernetes.io/projected/2782ade8-7344-423c-8ace-e9fe0b0fd207-kube-api-access-ng69v\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-auth-proxy-config\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e93e85-70a0-4853-8fc1-2101d5b26069-tmpfs\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811690 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811781 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-service-ca-bundle\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811851 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.811932 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-config\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812072 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812138 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812211 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812365 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-webhook-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812452 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812517 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsvv\" (UniqueName: \"kubernetes.io/projected/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-kube-api-access-lcsvv\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812581 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:25 crc 
kubenswrapper[4830]: I0318 18:04:25.812719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ht7b\" (UniqueName: \"kubernetes.io/projected/2032a3f5-b88c-423b-a25d-3768950ac81c-kube-api-access-7ht7b\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.812860 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.814478 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.815346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-apiservice-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.815580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.815798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e93e85-70a0-4853-8fc1-2101d5b26069-webhook-cert\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.854049 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.872676 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.893029 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.904197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-metrics-tls\") pod \"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.913194 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.934757 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.953145 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.973155 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 
18:04:25 crc kubenswrapper[4830]: I0318 18:04:25.993453 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.012551 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.032819 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.053935 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.073411 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.093132 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.102951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.115190 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.132843 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.145618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.152889 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.161531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.174577 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.183157 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.192395 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.206030 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.212628 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.224588 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.234132 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.234255 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.247594 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.253602 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.258249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.273004 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.293499 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.298336 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.323716 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.338946 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.352108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.363356 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.366276 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.373814 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.376577 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.386024 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.394040 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.413568 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.434469 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.454154 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.473115 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.500424 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.512100 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.534379 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.550762 4830 request.go:700] Waited for 1.015054671s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Dconsole-oauth-config&limit=500&resourceVersion=0 Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.552498 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.572612 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.593863 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.613100 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.633195 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.655060 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.673528 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.685303 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.692878 
4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.703022 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-config\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.714717 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.736977 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.745958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-default-certificate\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.753415 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.773531 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.784051 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-service-ca-bundle\") pod \"router-default-5444994796-mfdzp\" (UID: 
\"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.793624 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.803281 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-metrics-certs\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.809848 4830 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810011 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth podName:008bfbc9-9b16-4769-ba0a-116a67b7fdb4 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.309975697 +0000 UTC m=+101.877606069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth") pod "router-default-5444994796-mfdzp" (UID: "008bfbc9-9b16-4769-ba0a-116a67b7fdb4") : failed to sync secret cache: timed out waiting for the condition Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.809860 4830 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810117 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert podName:abd70175-8048-40dc-8f82-72d1112b0af0 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.31008971 +0000 UTC m=+101.877720082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert") pod "kube-apiserver-operator-766d6c64bb-59q79" (UID: "abd70175-8048-40dc-8f82-72d1112b0af0") : failed to sync secret cache: timed out waiting for the condition Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810196 4830 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810242 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config podName:abd70175-8048-40dc-8f82-72d1112b0af0 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.310228094 +0000 UTC m=+101.877858466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config") pod "kube-apiserver-operator-766d6c64bb-59q79" (UID: "abd70175-8048-40dc-8f82-72d1112b0af0") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810284 4830 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.810328 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images podName:69e95a31-ebb4-4647-b358-ad5a85023485 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.310316936 +0000 UTC m=+101.877947298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images") pod "machine-config-operator-74547568cd-npj9g" (UID: "69e95a31-ebb4-4647-b358-ad5a85023485") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811003 4830 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811130 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert podName:2032a3f5-b88c-423b-a25d-3768950ac81c nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.311095225 +0000 UTC m=+101.878725647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert") pod "olm-operator-6b444d44fb-4cqtc" (UID: "2032a3f5-b88c-423b-a25d-3768950ac81c") : failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811388 4830 configmap.go:193] Couldn't get configMap openshift-controller-manager-operator/openshift-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811504 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config podName:2782ade8-7344-423c-8ace-e9fe0b0fd207 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.311478075 +0000 UTC m=+101.879108527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config") pod "openshift-controller-manager-operator-756b6f6bc6-xnxfs" (UID: "2782ade8-7344-423c-8ace-e9fe0b0fd207") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811573 4830 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.811660 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls podName:69e95a31-ebb4-4647-b358-ad5a85023485 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.311638479 +0000 UTC m=+101.879268941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls") pod "machine-config-operator-74547568cd-npj9g" (UID: "69e95a31-ebb4-4647-b358-ad5a85023485") : failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812128 4830 secret.go:188] Couldn't get secret openshift-controller-manager-operator/openshift-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812151 4830 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812199 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert podName:2782ade8-7344-423c-8ace-e9fe0b0fd207 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.312184313 +0000 UTC m=+101.879814745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert") pod "openshift-controller-manager-operator-756b6f6bc6-xnxfs" (UID: "2782ade8-7344-423c-8ace-e9fe0b0fd207") : failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812238 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token podName:01289133-eb09-4497-8df9-bfd2ee3e0357 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.312214874 +0000 UTC m=+101.879845336 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token") pod "machine-config-server-vk4md" (UID: "01289133-eb09-4497-8df9-bfd2ee3e0357") : failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812245 4830 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.812294 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume podName:19aab548-96a0-4056-8226-f9e7cf4b3ca3 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.312283146 +0000 UTC m=+101.879913638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume") pod "collect-profiles-29564280-6hx2v" (UID: "19aab548-96a0-4056-8226-f9e7cf4b3ca3") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.813284 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.813564 4830 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: E0318 18:04:26.813635 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs podName:01289133-eb09-4497-8df9-bfd2ee3e0357 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:27.313617849 +0000 UTC m=+101.881248221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs") pod "machine-config-server-vk4md" (UID: "01289133-eb09-4497-8df9-bfd2ee3e0357") : failed to sync secret cache: timed out waiting for the condition
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.832568 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.851916 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.873259 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.892158 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.913320 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.932150 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.953176 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.973115 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 18:04:26 crc kubenswrapper[4830]: I0318 18:04:26.992982 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.013206 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.034037 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.053639 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.073121 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.093992 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.112527 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.132624 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.152748 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.172383 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.193477 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.212755 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.233291 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.233647 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.233647 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.253260 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.272829 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.293735 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.347842 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.347922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.348020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.348064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.348099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.348984 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349074 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349319 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349542 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69e95a31-ebb4-4647-b358-ad5a85023485-images\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.349564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.350412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2782ade8-7344-423c-8ace-e9fe0b0fd207-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.350415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd70175-8048-40dc-8f82-72d1112b0af0-config\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.353393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.354672 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd70175-8048-40dc-8f82-72d1112b0af0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.357188 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-stats-auth\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.368341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-node-bootstrap-token\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.369327 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2032a3f5-b88c-423b-a25d-3768950ac81c-srv-cert\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.370292 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01289133-eb09-4497-8df9-bfd2ee3e0357-certs\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.371609 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2782ade8-7344-423c-8ace-e9fe0b0fd207-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.376555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69e95a31-ebb4-4647-b358-ad5a85023485-proxy-tls\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.378919 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvcj\" (UniqueName: \"kubernetes.io/projected/884979c1-fccc-4bd6-b6db-4ec35bd9bdf7-kube-api-access-jhvcj\") pod \"openshift-config-operator-7777fb866f-kr648\" (UID: \"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.387107 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkxts\" (UniqueName: \"kubernetes.io/projected/d43b41ee-1e3d-42bf-8856-0678d441ac96-kube-api-access-dkxts\") pod \"apiserver-7bbb656c7d-78nbq\" (UID: \"d43b41ee-1e3d-42bf-8856-0678d441ac96\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.405918 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zxs7\" (UniqueName: \"kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7\") pod \"controller-manager-879f6c89f-cnwlr\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.412837 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.421253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbwn\" (UniqueName: \"kubernetes.io/projected/9279fbd5-1378-4a9a-b35d-85a7b9430674-kube-api-access-hbbwn\") pod \"machine-api-operator-5694c8668f-f77k6\" (UID: \"9279fbd5-1378-4a9a-b35d-85a7b9430674\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.433804 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.450222 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.453398 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.475419 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.476131 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.493466 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.502744 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.510617 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.515399 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.533052 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.551374 4830 request.go:700] Waited for 1.906813471s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/configmaps?fieldSelector=metadata.name%3Dcni-sysctl-allowlist&limit=500&resourceVersion=0
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.553211 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.608455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scv79\" (UniqueName: \"kubernetes.io/projected/96025a62-2435-451a-93bf-b03d24d6cfc1-kube-api-access-scv79\") pod \"authentication-operator-69f744f599-s6twn\" (UID: \"96025a62-2435-451a-93bf-b03d24d6cfc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.613571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9v5\" (UniqueName: \"kubernetes.io/projected/6ba02dfe-ace9-4644-b56c-cba779cfb2ec-kube-api-access-ld9v5\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvn7f\" (UID: \"6ba02dfe-ace9-4644-b56c-cba779cfb2ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.650077 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2tl\" (UniqueName: \"kubernetes.io/projected/8b0c7aa1-248e-4847-97f5-c08c17e78c3d-kube-api-access-rz2tl\") pod \"downloads-7954f5f757-t7xl2\" (UID: \"8b0c7aa1-248e-4847-97f5-c08c17e78c3d\") " pod="openshift-console/downloads-7954f5f757-t7xl2"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.669992 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.677252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmw9w\" (UniqueName: \"kubernetes.io/projected/498fcaeb-168e-4860-9f5d-a7c72ee1808f-kube-api-access-fmw9w\") pod \"etcd-operator-b45778765-xv8d6\" (UID: \"498fcaeb-168e-4860-9f5d-a7c72ee1808f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.686274 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.693326 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbpk\" (UniqueName: \"kubernetes.io/projected/c8de305f-4cde-4354-ad95-b74003e014a2-kube-api-access-xwbpk\") pod \"machine-config-controller-84d6567774-gm5vv\" (UID: \"c8de305f-4cde-4354-ad95-b74003e014a2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.706530 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.719095 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42w7x\" (UniqueName: \"kubernetes.io/projected/8bcb1a4d-f708-4d3a-81f1-8373e36eb474-kube-api-access-42w7x\") pod \"console-operator-58897d9998-cww5w\" (UID: \"8bcb1a4d-f708-4d3a-81f1-8373e36eb474\") " pod="openshift-console-operator/console-operator-58897d9998-cww5w"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.721918 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t7xl2"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.740871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8s7\" (UniqueName: \"kubernetes.io/projected/215da1aa-97ec-4ef7-a65d-597190dc6c63-kube-api-access-nf8s7\") pod \"ingress-operator-5b745b69d9-fltkw\" (UID: \"215da1aa-97ec-4ef7-a65d-597190dc6c63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.748021 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl5sk\" (UniqueName: \"kubernetes.io/projected/01d0fb39-b10a-4717-8e77-ed73f95166bd-kube-api-access-tl5sk\") pod \"package-server-manager-789f6589d5-6kfbf\" (UID: \"01d0fb39-b10a-4717-8e77-ed73f95166bd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.766929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2pc\" (UniqueName: \"kubernetes.io/projected/a043d02e-8a8a-42e6-839d-15dc6c0b43b6-kube-api-access-rc2pc\") pod \"cluster-samples-operator-665b6dd947-qgk4d\" (UID: \"a043d02e-8a8a-42e6-839d-15dc6c0b43b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.796995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmntm\" (UniqueName: \"kubernetes.io/projected/57c79b01-642f-4c45-886c-b3e852c0bc23-kube-api-access-zmntm\") pod \"control-plane-machine-set-operator-78cbb6b69f-g5rg5\" (UID: \"57c79b01-642f-4c45-886c-b3e852c0bc23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.806564 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4nb6\" (UniqueName: \"kubernetes.io/projected/2b7160ce-096f-4305-9954-982608b133ac-kube-api-access-s4nb6\") pod \"apiserver-76f77b778f-bzlw5\" (UID: \"2b7160ce-096f-4305-9954-982608b133ac\") " pod="openshift-apiserver/apiserver-76f77b778f-bzlw5"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.829223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855js\" (UniqueName: \"kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js\") pod \"marketplace-operator-79b997595-j2j42\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") " pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.848118 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhl9k\" (UniqueName: \"kubernetes.io/projected/602fd67e-3c82-46aa-879d-17bbd976e85b-kube-api-access-lhl9k\") pod \"multus-admission-controller-857f4d67dd-s77pq\" (UID: \"602fd67e-3c82-46aa-879d-17bbd976e85b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.853386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.857780 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kr648"]
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.891203 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.892301 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhz5c\" (UID: \"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.905994 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.916917 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.923897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6njd\" (UniqueName: \"kubernetes.io/projected/b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd-kube-api-access-x6njd\") pod \"dns-operator-744455d44c-wnkwx\" (UID: \"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.924114 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvnn\" (UniqueName: \"kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn\") pod \"route-controller-manager-6576b87f9c-27p2h\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.926868 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cww5w"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.940060 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bgv\" (UniqueName: \"kubernetes.io/projected/01289133-eb09-4497-8df9-bfd2ee3e0357-kube-api-access-46bgv\") pod \"machine-config-server-vk4md\" (UID: \"01289133-eb09-4497-8df9-bfd2ee3e0357\") " pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.940369 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.940681 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vk4md"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.968108 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.968998 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"
Mar 18 18:04:27 crc kubenswrapper[4830]: I0318 18:04:27.977606 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:27.987625 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvrcp\" (UniqueName: \"kubernetes.io/projected/b2e93e85-70a0-4853-8fc1-2101d5b26069-kube-api-access-kvrcp\") pod \"packageserver-d55dfcdfc-5q9t5\" (UID: \"b2e93e85-70a0-4853-8fc1-2101d5b26069\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:27.990911 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtw6\" (UniqueName: \"kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6\") pod \"oauth-openshift-558db77b4-kczvm\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") " pod="openshift-authentication/oauth-openshift-558db77b4-kczvm"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:27.996968 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.008878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvxs\" (UniqueName: \"kubernetes.io/projected/69e95a31-ebb4-4647-b358-ad5a85023485-kube-api-access-cfvxs\") pod \"machine-config-operator-74547568cd-npj9g\" (UID: \"69e95a31-ebb4-4647-b358-ad5a85023485\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.028314 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.031982 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abd70175-8048-40dc-8f82-72d1112b0af0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-59q79\" (UID: \"abd70175-8048-40dc-8f82-72d1112b0af0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.049552 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.051112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng69v\" (UniqueName: \"kubernetes.io/projected/2782ade8-7344-423c-8ace-e9fe0b0fd207-kube-api-access-ng69v\") pod \"openshift-controller-manager-operator-756b6f6bc6-xnxfs\" (UID: \"2782ade8-7344-423c-8ace-e9fe0b0fd207\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.051642 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xv8d6"]
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.066729 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2kn\" (UniqueName: \"kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn\") pod \"collect-profiles-29564280-6hx2v\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.067253 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f77k6"]
Mar 18
18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.067450 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.069322 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.073026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ht7b\" (UniqueName: \"kubernetes.io/projected/2032a3f5-b88c-423b-a25d-3768950ac81c-kube-api-access-7ht7b\") pod \"olm-operator-6b444d44fb-4cqtc\" (UID: \"2032a3f5-b88c-423b-a25d-3768950ac81c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.075159 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.090171 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.091145 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsvv\" (UniqueName: \"kubernetes.io/projected/008bfbc9-9b16-4769-ba0a-116a67b7fdb4-kube-api-access-lcsvv\") pod \"router-default-5444994796-mfdzp\" (UID: \"008bfbc9-9b16-4769-ba0a-116a67b7fdb4\") " pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.096371 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t7xl2"] Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.105019 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod498fcaeb_168e_4860_9f5d_a7c72ee1808f.slice/crio-f94f50ae53cf34e1c8d97a720747c4a5bbf8fdf9fe30eefb3f5023d39b352c72 WatchSource:0}: Error finding container f94f50ae53cf34e1c8d97a720747c4a5bbf8fdf9fe30eefb3f5023d39b352c72: Status 404 returned error can't find the container with id f94f50ae53cf34e1c8d97a720747c4a5bbf8fdf9fe30eefb3f5023d39b352c72 Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.105375 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6twn"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.107130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94158f29-fe7c-44f5-85a6-ec3e7f39a50a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ltsqn\" (UID: \"94158f29-fe7c-44f5-85a6-ec3e7f39a50a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.110647 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884979c1_fccc_4bd6_b6db_4ec35bd9bdf7.slice/crio-67d7adc6a22fae881067c02bf5b86ae11eba40db98d19f6df76eea58d0b00d3d WatchSource:0}: Error finding container 67d7adc6a22fae881067c02bf5b86ae11eba40db98d19f6df76eea58d0b00d3d: Status 404 returned error can't find the container with id 67d7adc6a22fae881067c02bf5b86ae11eba40db98d19f6df76eea58d0b00d3d Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.114084 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35a6cab9_b63f_4ed5_ac08_897e894498c5.slice/crio-ac01a25dc5e6d39db2ae11b713c3febc375735b48f2d725dd976dd196a3c75cf WatchSource:0}: Error finding container ac01a25dc5e6d39db2ae11b713c3febc375735b48f2d725dd976dd196a3c75cf: Status 404 returned error can't find the container with id ac01a25dc5e6d39db2ae11b713c3febc375735b48f2d725dd976dd196a3c75cf Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.115414 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43b41ee_1e3d_42bf_8856_0678d441ac96.slice/crio-942a5f8d790f0876f8039151255bcf4b27a099abb4abe531ad812f5f14828bf1 WatchSource:0}: Error finding container 942a5f8d790f0876f8039151255bcf4b27a099abb4abe531ad812f5f14828bf1: Status 404 returned error can't find the container with id 942a5f8d790f0876f8039151255bcf4b27a099abb4abe531ad812f5f14828bf1 Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.115615 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0c7aa1_248e_4847_97f5_c08c17e78c3d.slice/crio-31ee00aa0f635077c1023c4ca73d06142677a3feeb1ee25c4b814ebcbb20f1e9 WatchSource:0}: Error finding container 31ee00aa0f635077c1023c4ca73d06142677a3feeb1ee25c4b814ebcbb20f1e9: Status 404 returned error can't find the container with id 
31ee00aa0f635077c1023c4ca73d06142677a3feeb1ee25c4b814ebcbb20f1e9 Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.122586 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96025a62_2435_451a_93bf_b03d24d6cfc1.slice/crio-88175ad536a8a2edce73ba91347ce0b6e18b638ac04ab1cc8f514c9d09b1e72b WatchSource:0}: Error finding container 88175ad536a8a2edce73ba91347ce0b6e18b638ac04ab1cc8f514c9d09b1e72b: Status 404 returned error can't find the container with id 88175ad536a8a2edce73ba91347ce0b6e18b638ac04ab1cc8f514c9d09b1e72b Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.130080 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.132468 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.142010 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.145624 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.151988 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.153183 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.160960 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.167720 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" Mar 18 18:04:28 crc kubenswrapper[4830]: W0318 18:04:28.171282 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215da1aa_97ec_4ef7_a65d_597190dc6c63.slice/crio-b620c92021de59056e200e41b7d81fa417fce5b6195beeac301548177c8a063e WatchSource:0}: Error finding container b620c92021de59056e200e41b7d81fa417fce5b6195beeac301548177c8a063e: Status 404 returned error can't find the container with id b620c92021de59056e200e41b7d81fa417fce5b6195beeac301548177c8a063e Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.173726 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.174061 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.183538 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.192580 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.202254 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.215069 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.232660 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.271447 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.275733 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.275786 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6d8\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvpzs\" (UniqueName: \"kubernetes.io/projected/8ddeb5b0-b33a-45aa-8129-e227613b85f7-kube-api-access-tvpzs\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276178 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ddeb5b0-b33a-45aa-8129-e227613b85f7-serving-cert\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-srv-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276210 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276251 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-metrics-tls\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276290 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-key\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-cabundle\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276379 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smqtb\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-kube-api-access-smqtb\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276398 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5586f3-1dc6-422a-b01a-9719ed806021-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276412 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276622 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276640 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p6sn\" (UniqueName: 
\"kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276656 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ed8be83-14ab-44e2-9c05-bf0306320a71-machine-approver-tls\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276678 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9k2\" (UniqueName: \"kubernetes.io/projected/3ed8be83-14ab-44e2-9c05-bf0306320a71-kube-api-access-vk9k2\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276696 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnr5j\" (UniqueName: \"kubernetes.io/projected/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-kube-api-access-hnr5j\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc 
kubenswrapper[4830]: I0318 18:04:28.276735 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgbt\" (UniqueName: \"kubernetes.io/projected/95049de3-9804-4b1a-ba38-495ecbff971b-kube-api-access-5bgbt\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276750 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxl85\" (UniqueName: \"kubernetes.io/projected/dae4f68b-bbf7-441a-8c3f-8a260664215c-kube-api-access-dxl85\") pod \"migrator-59844c95c7-w4n2s\" (UID: \"dae4f68b-bbf7-441a-8c3f-8a260664215c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276790 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzt9r\" (UniqueName: \"kubernetes.io/projected/1d5586f3-1dc6-422a-b01a-9719ed806021-kube-api-access-fzt9r\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276892 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5586f3-1dc6-422a-b01a-9719ed806021-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j87c\" (UniqueName: \"kubernetes.io/projected/52dfe6d5-441c-4853-810d-f56db985d9bd-kube-api-access-8j87c\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.276963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-auth-proxy-config\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277003 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-profile-collector-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 
18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277118 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-config\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277159 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-config-volume\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277235 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.277254 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ddeb5b0-b33a-45aa-8129-e227613b85f7-config\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.278622 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:28.778610383 +0000 UTC m=+103.346240715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.303877 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.381406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.381905 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.381951 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.381983 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-socket-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-config\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382056 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382084 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-config-volume\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbl8\" (UniqueName: 
\"kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382132 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382267 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ddeb5b0-b33a-45aa-8129-e227613b85f7-config\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382400 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6ltq\" (UniqueName: \"kubernetes.io/projected/2dbc415e-f205-44f6-bd62-17e259bb08d6-kube-api-access-w6ltq\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382463 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6d8\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvpzs\" (UniqueName: \"kubernetes.io/projected/8ddeb5b0-b33a-45aa-8129-e227613b85f7-kube-api-access-tvpzs\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382562 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ddeb5b0-b33a-45aa-8129-e227613b85f7-serving-cert\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-srv-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382620 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382692 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382715 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-metrics-tls\") 
pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382736 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-csi-data-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382826 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-key\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382871 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-cabundle\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: 
\"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382919 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smqtb\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-kube-api-access-smqtb\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382942 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-plugins-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.382996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5586f3-1dc6-422a-b01a-9719ed806021-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383052 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2dbc415e-f205-44f6-bd62-17e259bb08d6-cert\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383144 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-registration-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p6sn\" (UniqueName: \"kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ed8be83-14ab-44e2-9c05-bf0306320a71-machine-approver-tls\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383259 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9k2\" (UniqueName: \"kubernetes.io/projected/3ed8be83-14ab-44e2-9c05-bf0306320a71-kube-api-access-vk9k2\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383285 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383332 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnr5j\" (UniqueName: \"kubernetes.io/projected/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-kube-api-access-hnr5j\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383355 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgbt\" (UniqueName: \"kubernetes.io/projected/95049de3-9804-4b1a-ba38-495ecbff971b-kube-api-access-5bgbt\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxl85\" (UniqueName: \"kubernetes.io/projected/dae4f68b-bbf7-441a-8c3f-8a260664215c-kube-api-access-dxl85\") pod \"migrator-59844c95c7-w4n2s\" (UID: \"dae4f68b-bbf7-441a-8c3f-8a260664215c\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383435 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-mountpoint-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383471 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383520 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383541 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzt9r\" (UniqueName: \"kubernetes.io/projected/1d5586f3-1dc6-422a-b01a-9719ed806021-kube-api-access-fzt9r\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5586f3-1dc6-422a-b01a-9719ed806021-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383659 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j87c\" (UniqueName: \"kubernetes.io/projected/52dfe6d5-441c-4853-810d-f56db985d9bd-kube-api-access-8j87c\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.383738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-auth-proxy-config\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.384429 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:28.88440471 +0000 UTC m=+103.452035052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.385358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-config\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.388148 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-config-volume\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.388666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-cabundle\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.392215 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc 
kubenswrapper[4830]: I0318 18:04:28.393393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.394515 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.393397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.395489 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.399323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-metrics-tls\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.400046 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.402393 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.404252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5586f3-1dc6-422a-b01a-9719ed806021-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.404589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.404868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.406397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ed8be83-14ab-44e2-9c05-bf0306320a71-auth-proxy-config\") 
pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.406467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ddeb5b0-b33a-45aa-8129-e227613b85f7-config\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.406945 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-profile-collector-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.406998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfx7v\" (UniqueName: \"kubernetes.io/projected/714819f9-4554-4f15-bf01-40ba2f401872-kube-api-access-zfx7v\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.407827 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5586f3-1dc6-422a-b01a-9719ed806021-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.408301 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-srv-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.409584 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.414405 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ddeb5b0-b33a-45aa-8129-e227613b85f7-serving-cert\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.417501 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.428362 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.429184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ed8be83-14ab-44e2-9c05-bf0306320a71-machine-approver-tls\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.432485 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.435261 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52dfe6d5-441c-4853-810d-f56db985d9bd-signing-key\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.438849 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/95049de3-9804-4b1a-ba38-495ecbff971b-profile-collector-cert\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.440363 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 
crc kubenswrapper[4830]: I0318 18:04:28.447714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.448685 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.454566 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.467738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smqtb\" (UniqueName: \"kubernetes.io/projected/2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b-kube-api-access-smqtb\") pod \"cluster-image-registry-operator-dc59b4c8b-9x2m7\" (UID: \"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.489432 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6d8\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.492072 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512661 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dbl8\" (UniqueName: \"kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512701 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6ltq\" (UniqueName: \"kubernetes.io/projected/2dbc415e-f205-44f6-bd62-17e259bb08d6-kube-api-access-w6ltq\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512783 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-csi-data-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512799 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: 
\"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512834 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-plugins-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512850 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dbc415e-f205-44f6-bd62-17e259bb08d6-cert\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-registration-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512918 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-mountpoint-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512940 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512979 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfx7v\" (UniqueName: \"kubernetes.io/projected/714819f9-4554-4f15-bf01-40ba2f401872-kube-api-access-zfx7v\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.512995 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-socket-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.513022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.513290 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.01327921 +0000 UTC m=+103.580909542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-registration-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514473 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-mountpoint-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514521 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-csi-data-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " 
pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514650 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-socket-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514684 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/714819f9-4554-4f15-bf01-40ba2f401872-plugins-dir\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.514897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.515660 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvpzs\" (UniqueName: \"kubernetes.io/projected/8ddeb5b0-b33a-45aa-8129-e227613b85f7-kube-api-access-tvpzs\") pod \"service-ca-operator-777779d784-rpbbt\" (UID: \"8ddeb5b0-b33a-45aa-8129-e227613b85f7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.525950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dbc415e-f205-44f6-bd62-17e259bb08d6-cert\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.538745 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnr5j\" (UniqueName: \"kubernetes.io/projected/1fe6ec9e-1fc2-46f6-b3a8-257c478f278a-kube-api-access-hnr5j\") pod \"dns-default-qx95k\" (UID: \"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a\") " pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.562545 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgbt\" (UniqueName: \"kubernetes.io/projected/95049de3-9804-4b1a-ba38-495ecbff971b-kube-api-access-5bgbt\") pod \"catalog-operator-68c6474976-2xz7d\" (UID: \"95049de3-9804-4b1a-ba38-495ecbff971b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.585356 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxl85\" (UniqueName: \"kubernetes.io/projected/dae4f68b-bbf7-441a-8c3f-8a260664215c-kube-api-access-dxl85\") pod \"migrator-59844c95c7-w4n2s\" (UID: \"dae4f68b-bbf7-441a-8c3f-8a260664215c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.616436 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.616821 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.11680671 +0000 UTC m=+103.684437042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.621370 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p6sn\" (UniqueName: \"kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn\") pod \"console-f9d7485db-lfz57\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.634699 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9k2\" (UniqueName: \"kubernetes.io/projected/3ed8be83-14ab-44e2-9c05-bf0306320a71-kube-api-access-vk9k2\") pod \"machine-approver-56656f9798-f4wwz\" (UID: \"3ed8be83-14ab-44e2-9c05-bf0306320a71\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.641978 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.661377 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.675191 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzt9r\" (UniqueName: \"kubernetes.io/projected/1d5586f3-1dc6-422a-b01a-9719ed806021-kube-api-access-fzt9r\") pod \"openshift-apiserver-operator-796bbdcf4f-sztnk\" (UID: \"1d5586f3-1dc6-422a-b01a-9719ed806021\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.675201 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.681516 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j87c\" (UniqueName: \"kubernetes.io/projected/52dfe6d5-441c-4853-810d-f56db985d9bd-kube-api-access-8j87c\") pod \"service-ca-9c57cc56f-wjtsp\" (UID: \"52dfe6d5-441c-4853-810d-f56db985d9bd\") " pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.681730 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.697026 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.704906 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dbl8\" (UniqueName: \"kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8\") pod \"cni-sysctl-allowlist-ds-lrwxl\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.714688 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6ltq\" (UniqueName: \"kubernetes.io/projected/2dbc415e-f205-44f6-bd62-17e259bb08d6-kube-api-access-w6ltq\") pod \"ingress-canary-gjjbf\" (UID: \"2dbc415e-f205-44f6-bd62-17e259bb08d6\") " pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.718122 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.718391 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.218378219 +0000 UTC m=+103.786008551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.722510 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.739372 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.741253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfx7v\" (UniqueName: \"kubernetes.io/projected/714819f9-4554-4f15-bf01-40ba2f401872-kube-api-access-zfx7v\") pod \"csi-hostpathplugin-r267p\" (UID: \"714819f9-4554-4f15-bf01-40ba2f401872\") " pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.801394 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" event={"ID":"b2e93e85-70a0-4853-8fc1-2101d5b26069","Type":"ContainerStarted","Data":"a4b1a67e2df66d1982691c6b4eae40e300fb440edbfa269d5f9284257e30809d"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.808956 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qx95k" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.818760 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" event={"ID":"498fcaeb-168e-4860-9f5d-a7c72ee1808f","Type":"ContainerStarted","Data":"f94f50ae53cf34e1c8d97a720747c4a5bbf8fdf9fe30eefb3f5023d39b352c72"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.819732 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.820127 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.320101403 +0000 UTC m=+103.887731735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.820185 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.820489 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.320482103 +0000 UTC m=+103.888112435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.834607 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" event={"ID":"9279fbd5-1378-4a9a-b35d-85a7b9430674","Type":"ContainerStarted","Data":"6464502ba3e491703345b349d1a93a8e05d7f200bab11e2648a7d39872bef423"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.835063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" event={"ID":"9279fbd5-1378-4a9a-b35d-85a7b9430674","Type":"ContainerStarted","Data":"3d53658e1e6407a869f7f6860fcc259d2ed1f4493b979f243e2b5bdefe63f031"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.837283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" event={"ID":"d43b41ee-1e3d-42bf-8856-0678d441ac96","Type":"ContainerStarted","Data":"942a5f8d790f0876f8039151255bcf4b27a099abb4abe531ad812f5f14828bf1"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.838028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" event={"ID":"57c79b01-642f-4c45-886c-b3e852c0bc23","Type":"ContainerStarted","Data":"515f909eada8f2c6c876fae0493d7f3e0f458fb4f2c5ab1a3bee00fcc2162e6d"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.843182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" 
event={"ID":"35a6cab9-b63f-4ed5-ac08-897e894498c5","Type":"ContainerStarted","Data":"ac01a25dc5e6d39db2ae11b713c3febc375735b48f2d725dd976dd196a3c75cf"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.846241 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.847205 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cww5w"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.849231 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vk4md" event={"ID":"01289133-eb09-4497-8df9-bfd2ee3e0357","Type":"ContainerStarted","Data":"2574c17e365049a3be6c26ffc4ae885436b0460cfa0aca7e3c372f61ab87adec"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.855316 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r267p" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.856699 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" event={"ID":"c8de305f-4cde-4354-ad95-b74003e014a2","Type":"ContainerStarted","Data":"f587872fd1ad1fb2c5e594ff3c4b1427cf9e06ca0a9425b1133679fdd9b6d7f7"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.859857 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gjjbf" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.867816 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.871300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t7xl2" event={"ID":"8b0c7aa1-248e-4847-97f5-c08c17e78c3d","Type":"ContainerStarted","Data":"244c51394f80bc699de52ad44204bfd3709470cf07023e0800eadd92b5f51eec"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.871336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t7xl2" event={"ID":"8b0c7aa1-248e-4847-97f5-c08c17e78c3d","Type":"ContainerStarted","Data":"31ee00aa0f635077c1023c4ca73d06142677a3feeb1ee25c4b814ebcbb20f1e9"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.871812 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t7xl2" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.880370 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" event={"ID":"96025a62-2435-451a-93bf-b03d24d6cfc1","Type":"ContainerStarted","Data":"2a42b74a1432edad75b094f8b1b35aefda7439144c3e5248eeb78e5c3dc93eac"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.880430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" event={"ID":"96025a62-2435-451a-93bf-b03d24d6cfc1","Type":"ContainerStarted","Data":"88175ad536a8a2edce73ba91347ce0b6e18b638ac04ab1cc8f514c9d09b1e72b"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.886040 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7xl2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.886078 
4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7xl2" podUID="8b0c7aa1-248e-4847-97f5-c08c17e78c3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.888932 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" event={"ID":"215da1aa-97ec-4ef7-a65d-597190dc6c63","Type":"ContainerStarted","Data":"b620c92021de59056e200e41b7d81fa417fce5b6195beeac301548177c8a063e"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.900747 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"] Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.900907 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" event={"ID":"6ba02dfe-ace9-4644-b56c-cba779cfb2ec","Type":"ContainerStarted","Data":"38294a94d224c112526761fff7304538750b7d30113057e883618fab999a37fb"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.903955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mfdzp" event={"ID":"008bfbc9-9b16-4769-ba0a-116a67b7fdb4","Type":"ContainerStarted","Data":"589bdd5580ec945dad18a0da1499fb4e4d4568d44d577ce6a08710c0e99a4901"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.906290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" event={"ID":"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7","Type":"ContainerStarted","Data":"3fd61dea5493b054823cc65823154a6135f3e72807cc39f88c9838ed50dae29c"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.906310 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" event={"ID":"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7","Type":"ContainerStarted","Data":"67d7adc6a22fae881067c02bf5b86ae11eba40db98d19f6df76eea58d0b00d3d"} Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.921250 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.921365 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.421338244 +0000 UTC m=+103.988968576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.921595 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:28 crc kubenswrapper[4830]: E0318 18:04:28.923199 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.423181071 +0000 UTC m=+103.990811403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:28 crc kubenswrapper[4830]: I0318 18:04:28.932559 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz"
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.022544 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.025225 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.525209012 +0000 UTC m=+104.092839344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.045729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wnkwx"]
Mar 18 18:04:29 crc kubenswrapper[4830]: W0318 18:04:29.058069 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a21e6e_7bda_408b_a607_f02b4f807535.slice/crio-377506eab5a17193ab2406749c4ee6f41d9333773dd66510a3514ec88fb22891 WatchSource:0}: Error finding container 377506eab5a17193ab2406749c4ee6f41d9333773dd66510a3514ec88fb22891: Status 404 returned error can't find the container with id 377506eab5a17193ab2406749c4ee6f41d9333773dd66510a3514ec88fb22891
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.102655 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bzlw5"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.109995 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.126250 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.126489 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.626475694 +0000 UTC m=+104.194106016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.191623 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.201357 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.211699 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s77pq"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.218763 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.221176 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.230970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.231364 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.731335827 +0000 UTC m=+104.298966159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.231570 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.231878 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.731870611 +0000 UTC m=+104.299500943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.338535 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.338837 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.838745855 +0000 UTC m=+104.406376207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.339641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.340317 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.840295304 +0000 UTC m=+104.407925636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.440848 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.441589 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:29.941361041 +0000 UTC m=+104.508991373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.542510 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.543281 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.043267979 +0000 UTC m=+104.610898311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.584581 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.617631 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.628131 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.644922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.645265 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.145242069 +0000 UTC m=+104.712872401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.656064 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.662796 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt"]
Mar 18 18:04:29 crc kubenswrapper[4830]: W0318 18:04:29.681535 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94158f29_fe7c_44f5_85a6_ec3e7f39a50a.slice/crio-6d8a323f53274ad9ab8abf75ed7a6db0c88805126f484a4beb078b2f1ec60f8c WatchSource:0}: Error finding container 6d8a323f53274ad9ab8abf75ed7a6db0c88805126f484a4beb078b2f1ec60f8c: Status 404 returned error can't find the container with id 6d8a323f53274ad9ab8abf75ed7a6db0c88805126f484a4beb078b2f1ec60f8c
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.705326 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.716530 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wjtsp"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.747449 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.747927 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.247902136 +0000 UTC m=+104.815532468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.792395 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gjjbf"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.807284 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.809471 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lfz57"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.830517 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.835372 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qx95k"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.839240 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.841464 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.848191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.848557 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.348501811 +0000 UTC m=+104.916132143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.856044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.856644 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.356617567 +0000 UTC m=+104.924247899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.923738 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r267p"]
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.928629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" event={"ID":"52dfe6d5-441c-4853-810d-f56db985d9bd","Type":"ContainerStarted","Data":"8beb1cdbb7a1b72748e5a1babfce60f661baf3f9a172a94c87c97eb6152c185a"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.936939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" event={"ID":"57c79b01-642f-4c45-886c-b3e852c0bc23","Type":"ContainerStarted","Data":"e531b3947fb7e33fed54e0f35aed23e5958ce37eea91282114f412cbb5d04b5e"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.939250 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" event={"ID":"35a6cab9-b63f-4ed5-ac08-897e894498c5","Type":"ContainerStarted","Data":"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.940221 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.942064 4830 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cnwlr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.942115 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.946996 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" event={"ID":"2032a3f5-b88c-423b-a25d-3768950ac81c","Type":"ContainerStarted","Data":"5fc444af53e28b7a570343370aa46eb52c6cf30f7af132d9bec4290bbf123101"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.950045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" event={"ID":"1e193992-9fbb-46cc-bb80-ed0563456687","Type":"ContainerStarted","Data":"9bed7717e556d33909244919a91e052eb4990fe7f7b890f2c09d8f0650485320"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.952917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" event={"ID":"abd70175-8048-40dc-8f82-72d1112b0af0","Type":"ContainerStarted","Data":"ed1eff6e82b0b669ee77a3d68d097e8b477bf6adb32934803bde2492cc2fe667"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.957381 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:29 crc kubenswrapper[4830]: E0318 18:04:29.957816 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.457800046 +0000 UTC m=+105.025430378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.962742 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" event={"ID":"01d0fb39-b10a-4717-8e77-ed73f95166bd","Type":"ContainerStarted","Data":"5099d4273f7728440a24f3386658c3d50ed3c9dae9c4db6150be99518c0ff165"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.962809 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" event={"ID":"01d0fb39-b10a-4717-8e77-ed73f95166bd","Type":"ContainerStarted","Data":"9151858b24e7d88aa89c9900af564776d5ff1defc38885deb962c3319c2bdeb6"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.968733 4830 generic.go:334] "Generic (PLEG): container finished" podID="d43b41ee-1e3d-42bf-8856-0678d441ac96" containerID="da73a0463a8892e5210aa1671018b67121528d3d63262605861be48a232abafa" exitCode=0
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.969020 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" event={"ID":"d43b41ee-1e3d-42bf-8856-0678d441ac96","Type":"ContainerDied","Data":"da73a0463a8892e5210aa1671018b67121528d3d63262605861be48a232abafa"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.974709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" event={"ID":"19aab548-96a0-4056-8226-f9e7cf4b3ca3","Type":"ContainerStarted","Data":"25c71557e35285fabcbc5018999db857e6c2d7dceef7c15de7d8b301679c6aa1"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.984317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" event={"ID":"94158f29-fe7c-44f5-85a6-ec3e7f39a50a","Type":"ContainerStarted","Data":"6d8a323f53274ad9ab8abf75ed7a6db0c88805126f484a4beb078b2f1ec60f8c"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.995682 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" event={"ID":"84a21e6e-7bda-408b-a607-f02b4f807535","Type":"ContainerStarted","Data":"45ac4097ed87ea50bce298a8c5c2aac3970b76fc10ec18ef9adc87fa65809345"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.995729 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" event={"ID":"84a21e6e-7bda-408b-a607-f02b4f807535","Type":"ContainerStarted","Data":"377506eab5a17193ab2406749c4ee6f41d9333773dd66510a3514ec88fb22891"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.996265 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm"
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.997915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" event={"ID":"a043d02e-8a8a-42e6-839d-15dc6c0b43b6","Type":"ContainerStarted","Data":"1bd817a4a767c064c9b26db206bf6a6e301eb220026b4c6c9899defb83ac26fb"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.997940 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" event={"ID":"a043d02e-8a8a-42e6-839d-15dc6c0b43b6","Type":"ContainerStarted","Data":"6c5e1574f9061dcc5d30d92f7ef465dd8d33cedb677ec57dba82aab5cd4c88b2"}
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.999463 4830 generic.go:334] "Generic (PLEG): container finished" podID="884979c1-fccc-4bd6-b6db-4ec35bd9bdf7" containerID="3fd61dea5493b054823cc65823154a6135f3e72807cc39f88c9838ed50dae29c" exitCode=0
Mar 18 18:04:29 crc kubenswrapper[4830]: I0318 18:04:29.999622 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" event={"ID":"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7","Type":"ContainerDied","Data":"3fd61dea5493b054823cc65823154a6135f3e72807cc39f88c9838ed50dae29c"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:29.999813 4830 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kczvm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body=
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:29.999849 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.006650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" event={"ID":"1d5586f3-1dc6-422a-b01a-9719ed806021","Type":"ContainerStarted","Data":"fb04235bddd487d6eb19abdd355845df6929a44df67648612b034925fcae75d0"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.013587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" event={"ID":"498fcaeb-168e-4860-9f5d-a7c72ee1808f","Type":"ContainerStarted","Data":"9f60741f36698f9d00717b52349ca3cb263f2cb3256a5d4fbdb0377c2aaa0ebc"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.036888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" event={"ID":"9279fbd5-1378-4a9a-b35d-85a7b9430674","Type":"ContainerStarted","Data":"bcc0ec47053d73b0958129c4e4716d7ab39a7c7658cf64d8a768b131ef0fbf47"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.045313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" event={"ID":"c8de305f-4cde-4354-ad95-b74003e014a2","Type":"ContainerStarted","Data":"8e5677af0220b377d6b69ac452b20c793d46fb48d01e6f9bc092ecbfe441feb3"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.104669 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vk4md" event={"ID":"01289133-eb09-4497-8df9-bfd2ee3e0357","Type":"ContainerStarted","Data":"bb33fa4aa65d0b831c410625a59461fc3305c62018d39c2cb35f6269de6fd5e2"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.105353 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.109354 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.60933632 +0000 UTC m=+105.176966742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.118821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" event={"ID":"8ddeb5b0-b33a-45aa-8129-e227613b85f7","Type":"ContainerStarted","Data":"7c1291c91b875cd67301ec23a230fc78d96021ec2dc00cd0fb49838adcb1590b"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.137308 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" event={"ID":"b2e93e85-70a0-4853-8fc1-2101d5b26069","Type":"ContainerStarted","Data":"ca41988a8725d58de6fce607217f345defb0d4dc266a516873137246e511a81e"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.137576 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.153426 4830 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5q9t5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body=
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.153480 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" podUID="b2e93e85-70a0-4853-8fc1-2101d5b26069" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.156441 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cww5w" event={"ID":"8bcb1a4d-f708-4d3a-81f1-8373e36eb474","Type":"ContainerStarted","Data":"715618a755b0d31627193c8f97f5ce21bb59adbc108cb3aad524428060fd8ab0"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.156995 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cww5w" event={"ID":"8bcb1a4d-f708-4d3a-81f1-8373e36eb474","Type":"ContainerStarted","Data":"44e79bd22b2cfa8cd7fa0577968d0c7027ce87e1bdc1e74376d2fb40aa097394"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.158827 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cww5w"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.166976 4830 patch_prober.go:28] interesting pod/console-operator-58897d9998-cww5w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.167296 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cww5w" podUID="8bcb1a4d-f708-4d3a-81f1-8373e36eb474" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.175968 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g5rg5" podStartSLOduration=63.175932375 podStartE2EDuration="1m3.175932375s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.170687243 +0000 UTC m=+104.738317595" watchObservedRunningTime="2026-03-18 18:04:30.175932375 +0000 UTC m=+104.743562727"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.197718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" event={"ID":"7332042a-dffc-4c3e-94eb-2a1dedc58062","Type":"ContainerStarted","Data":"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.197781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" event={"ID":"7332042a-dffc-4c3e-94eb-2a1dedc58062","Type":"ContainerStarted","Data":"e3b3476a4f611f80c90c6f9c9f4cd6702275d3470a1c6289c245f21adcc96e36"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.198612 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.202351 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j2j42 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.202442 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.202713 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfz57" event={"ID":"95b4d24e-09da-4c0d-9d24-81621509024a","Type":"ContainerStarted","Data":"052cda2b09aa5376af7220895ab194ca96345372ea8b743b3811ddb0eb9b2045"}
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.208373 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.208753 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.708735435 +0000 UTC m=+105.276365767 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.219218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" event={"ID":"2782ade8-7344-423c-8ace-e9fe0b0fd207","Type":"ContainerStarted","Data":"00fca3270da7495bbe1fb7e39de62060600f8a9661d4d7a8ba674376a1d49182"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.223277 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" event={"ID":"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b","Type":"ContainerStarted","Data":"4746742f127abc4527374610f7dfa38b51f78d0e8ceb57324ace8a1c1c83624e"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.225271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mfdzp" event={"ID":"008bfbc9-9b16-4769-ba0a-116a67b7fdb4","Type":"ContainerStarted","Data":"776961b12948d04c013061c66beff64e1b72cabe82107b28894f5e8a7ea491bb"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.231576 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" event={"ID":"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd","Type":"ContainerStarted","Data":"e4f86c8450fa31adbea4fcebfd791dd6e007e5385fe419a69c44d04d704ee104"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.237691 4830 generic.go:334] "Generic (PLEG): container finished" podID="2b7160ce-096f-4305-9954-982608b133ac" 
containerID="d6cbbe58ab3d997b0640fcb26493ec38b4d58d2b6af119eac8f207b518797859" exitCode=0 Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.250388 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" event={"ID":"2b7160ce-096f-4305-9954-982608b133ac","Type":"ContainerDied","Data":"d6cbbe58ab3d997b0640fcb26493ec38b4d58d2b6af119eac8f207b518797859"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.250436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" event={"ID":"2b7160ce-096f-4305-9954-982608b133ac","Type":"ContainerStarted","Data":"bc31713d256ec0be2279312c08e34f573d031d71c9d511688b53ed00b52d5b4b"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.250454 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" event={"ID":"602fd67e-3c82-46aa-879d-17bbd976e85b","Type":"ContainerStarted","Data":"829359ee2fcddaf338d8fe7e4f4a441522c8d0e6914f7341cde672c61321ca4d"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.250879 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" event={"ID":"3ed8be83-14ab-44e2-9c05-bf0306320a71","Type":"ContainerStarted","Data":"c2ba3d4ef6c5f2eb2d7f40680b7674d63b595f994d2c936d3ea4aa53ef822cb0"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.261267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" event={"ID":"defba3b0-b36e-4b8e-a8b1-577782a54249","Type":"ContainerStarted","Data":"2216f56c8f6a58801f8930f2cb045c34fa23d8a783b3914886e516f7ea4c2efa"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.264211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" 
event={"ID":"69e95a31-ebb4-4647-b358-ad5a85023485","Type":"ContainerStarted","Data":"8c43a86b4482c2da8de2825d95980add635e9c5cebe92ab7f54993f51338f8a8"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.273058 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" event={"ID":"dae4f68b-bbf7-441a-8c3f-8a260664215c","Type":"ContainerStarted","Data":"cdc69a27dfce800f1ff971004ca402b3061313f49903bbc9461da17f3e7e2a84"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.284048 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" event={"ID":"6ba02dfe-ace9-4644-b56c-cba779cfb2ec","Type":"ContainerStarted","Data":"8ec28a97cfa6c679ef1235806a65ac1e4e7a15042aa2ee4e6e8c67a37abaa794"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.286821 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t7xl2" podStartSLOduration=64.28680346 podStartE2EDuration="1m4.28680346s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.241848333 +0000 UTC m=+104.809478665" watchObservedRunningTime="2026-03-18 18:04:30.28680346 +0000 UTC m=+104.854433792" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.287300 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-f77k6" podStartSLOduration=63.287294093 podStartE2EDuration="1m3.287294093s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.284221985 +0000 UTC m=+104.851852317" watchObservedRunningTime="2026-03-18 
18:04:30.287294093 +0000 UTC m=+104.854924425" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.316225 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.316497 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gjjbf" event={"ID":"2dbc415e-f205-44f6-bd62-17e259bb08d6","Type":"ContainerStarted","Data":"bab8a05d68a2ff819e20be4c096c160b80382ce28f5a106f91d3a20ec1d2d07e"} Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.319918 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.819897198 +0000 UTC m=+105.387527530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.336935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" event={"ID":"215da1aa-97ec-4ef7-a65d-597190dc6c63","Type":"ContainerStarted","Data":"adb7bcc653d58f5fe65e138a300d02df4dfc85eadf122d55dbbddc0a3f408f4c"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.336994 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" event={"ID":"215da1aa-97ec-4ef7-a65d-597190dc6c63","Type":"ContainerStarted","Data":"216eeae0cb46b48df7cd16ecfe47bebaec14e58fc0fb9766eac1254fae66abed"} Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.347172 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7xl2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.347243 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7xl2" podUID="8b0c7aa1-248e-4847-97f5-c08c17e78c3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.375482 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" podStartSLOduration=64.375465982 podStartE2EDuration="1m4.375465982s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.337455162 +0000 UTC m=+104.905085494" watchObservedRunningTime="2026-03-18 18:04:30.375465982 +0000 UTC m=+104.943096314" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.419427 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" podStartSLOduration=64.419409224 podStartE2EDuration="1m4.419409224s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.373601975 +0000 UTC m=+104.941232307" watchObservedRunningTime="2026-03-18 18:04:30.419409224 +0000 UTC m=+104.987039556" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.420021 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.920004599 +0000 UTC m=+105.487634931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.419622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.420176 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.421320 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:30.921296762 +0000 UTC m=+105.488927104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.521596 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.521935 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.021920078 +0000 UTC m=+105.589550410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.541593 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6twn" podStartSLOduration=64.541570975 podStartE2EDuration="1m4.541570975s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.488394769 +0000 UTC m=+105.056025101" watchObservedRunningTime="2026-03-18 18:04:30.541570975 +0000 UTC m=+105.109201307" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.543480 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xv8d6" podStartSLOduration=63.543472983 podStartE2EDuration="1m3.543472983s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.541211636 +0000 UTC m=+105.108841978" watchObservedRunningTime="2026-03-18 18:04:30.543472983 +0000 UTC m=+105.111103315" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.571014 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vk4md" podStartSLOduration=5.570997689 podStartE2EDuration="5.570997689s" podCreationTimestamp="2026-03-18 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.568062365 +0000 UTC m=+105.135692697" watchObservedRunningTime="2026-03-18 18:04:30.570997689 +0000 UTC m=+105.138628021" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.635951 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.636307 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.136295331 +0000 UTC m=+105.703925663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.739956 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.740208 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.24018745 +0000 UTC m=+105.807817792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.744254 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.744730 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.244714974 +0000 UTC m=+105.812345306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.845883 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.846113 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.346078919 +0000 UTC m=+105.913709261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.846535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.846854 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.346842658 +0000 UTC m=+105.914472990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.956527 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.956754 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.956879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.956929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.956975 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:30 crc kubenswrapper[4830]: E0318 18:04:30.964191 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.464143486 +0000 UTC m=+106.031773828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.973811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.979012 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.982949 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.986816 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvn7f" podStartSLOduration=63.986730037 podStartE2EDuration="1m3.986730037s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:30.963889049 +0000 UTC m=+105.531519381" watchObservedRunningTime="2026-03-18 18:04:30.986730037 +0000 UTC m=+105.554360369"
Mar 18 18:04:30 crc kubenswrapper[4830]: I0318 18:04:30.989569 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.055123 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cww5w" podStartSLOduration=65.055095427 podStartE2EDuration="1m5.055095427s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.045411412 +0000 UTC m=+105.613041754" watchObservedRunningTime="2026-03-18 18:04:31.055095427 +0000 UTC m=+105.622725759"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.061540 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.061583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.062839 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.562759371 +0000 UTC m=+106.130389703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.072278 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.082080 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/437f27f7-4531-4e3e-b3a9-a471c7630012-metrics-certs\") pod \"network-metrics-daemon-wx6kd\" (UID: \"437f27f7-4531-4e3e-b3a9-a471c7630012\") " pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.156406 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.156664 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mfdzp"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.159645 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:31 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:31 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:31 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.159692 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.165320 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.165700 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.665684295 +0000 UTC m=+106.233314627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.177733 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.210704 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" podStartSLOduration=64.210686053 podStartE2EDuration="1m4.210686053s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.176906379 +0000 UTC m=+105.744536711" watchObservedRunningTime="2026-03-18 18:04:31.210686053 +0000 UTC m=+105.778316375"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.268555 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.269132 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.769088471 +0000 UTC m=+106.336718803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.348507 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mfdzp" podStartSLOduration=64.34848463 podStartE2EDuration="1m4.34848463s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.346318365 +0000 UTC m=+105.913948697" watchObservedRunningTime="2026-03-18 18:04:31.34848463 +0000 UTC m=+105.916114962"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.361360 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wx6kd"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.369306 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.369576 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.869558603 +0000 UTC m=+106.437188935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.388354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" event={"ID":"1d5586f3-1dc6-422a-b01a-9719ed806021","Type":"ContainerStarted","Data":"ebe8f7808fda681d6524a38d719053296169de374b2d2cb7f51c932e36f86a2f"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.393503 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" event={"ID":"c8de305f-4cde-4354-ad95-b74003e014a2","Type":"ContainerStarted","Data":"de0e4138dc440e1e08ff21d124cc338e189693b53471ffbfa8d97be7074f82e1"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.403607 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fltkw" podStartSLOduration=64.403592094 podStartE2EDuration="1m4.403592094s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.40147144 +0000 UTC m=+105.969101772" watchObservedRunningTime="2026-03-18 18:04:31.403592094 +0000 UTC m=+105.971222426"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.425889 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" event={"ID":"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b","Type":"ContainerStarted","Data":"903c617814bc2b5123d196ff66e3d226b06157969e1f4000e650b95f51d0ed8d"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.425986 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" event={"ID":"2186f1d1-98b2-4ad7-91cd-acb99fc7aa5b","Type":"ContainerStarted","Data":"285d404d491e4024e9b6320e1ab524f80cc1c0dbf80570b7a24313d7d4e246be"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.428011 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" podStartSLOduration=64.427992731 podStartE2EDuration="1m4.427992731s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.427784496 +0000 UTC m=+105.995414828" watchObservedRunningTime="2026-03-18 18:04:31.427992731 +0000 UTC m=+105.995623063"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.445588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" event={"ID":"01d0fb39-b10a-4717-8e77-ed73f95166bd","Type":"ContainerStarted","Data":"33c47f7819b06875a925fd61b335ec8a1cbf87d68af1407709e32e34a74e0660"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.446356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.460191 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" event={"ID":"2e1a48f2-d6ff-4699-aea7-66f08f0a4e4b","Type":"ContainerStarted","Data":"61e79751d799720e3791225324943a091dc8b27c1003e3f21ec672095ae4313e"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.470944 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.473180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" event={"ID":"2032a3f5-b88c-423b-a25d-3768950ac81c","Type":"ContainerStarted","Data":"a9684378675f8ae8f0c3b86db4725b3a4444ae17b299044f0074a006b6e91cc4"}
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.473623 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:31.973610145 +0000 UTC m=+106.541240467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.474814 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.525226 4830 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4cqtc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.525306 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" podUID="2032a3f5-b88c-423b-a25d-3768950ac81c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.553345 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm5vv" podStartSLOduration=64.553313812 podStartE2EDuration="1m4.553313812s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.550674355 +0000 UTC m=+106.118304677" watchObservedRunningTime="2026-03-18 18:04:31.553313812 +0000 UTC m=+106.120944144"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.572150 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.575437 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.075405421 +0000 UTC m=+106.643035753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.598259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" event={"ID":"1e193992-9fbb-46cc-bb80-ed0563456687","Type":"ContainerStarted","Data":"81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.599234 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.609382 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sztnk" podStartSLOduration=65.60935628 podStartE2EDuration="1m5.60935628s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.59947989 +0000 UTC m=+106.167110222" watchObservedRunningTime="2026-03-18 18:04:31.60935628 +0000 UTC m=+106.176986612"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.673815 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.674442 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.174428716 +0000 UTC m=+106.742059048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.675593 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc" podStartSLOduration=64.675582545 podStartE2EDuration="1m4.675582545s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.675211216 +0000 UTC m=+106.242841548" watchObservedRunningTime="2026-03-18 18:04:31.675582545 +0000 UTC m=+106.243212877"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.676376 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhz5c" podStartSLOduration=64.676368325 podStartE2EDuration="1m4.676368325s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.64178315 +0000 UTC m=+106.209413482" watchObservedRunningTime="2026-03-18 18:04:31.676368325 +0000 UTC m=+106.243998657"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.709917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gjjbf" event={"ID":"2dbc415e-f205-44f6-bd62-17e259bb08d6","Type":"ContainerStarted","Data":"9f325968ba63f3f74f69591a56e970cb5d35ee5d28240163362df9109e93b052"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.737600 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" event={"ID":"defba3b0-b36e-4b8e-a8b1-577782a54249","Type":"ContainerStarted","Data":"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.739651 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" podStartSLOduration=64.739612535 podStartE2EDuration="1m4.739612535s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.710623142 +0000 UTC m=+106.278253474" watchObservedRunningTime="2026-03-18 18:04:31.739612535 +0000 UTC m=+106.307242877"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.740347 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.750543 4830 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-27p2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.750585 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 18 crc kubenswrapper[4830]: I0318 18:04:31.753416 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" event={"ID":"8ddeb5b0-b33a-45aa-8129-e227613b85f7","Type":"ContainerStarted","Data":"8c8524915f5a8fb33981a6e31435f997398117f1fd0b472e0ada00126978b39b"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.774441 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9x2m7" podStartSLOduration=64.774411256 podStartE2EDuration="1m4.774411256s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.743212786 +0000 UTC m=+106.310843118" watchObservedRunningTime="2026-03-18 18:04:31.774411256 +0000 UTC m=+106.342041588"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.779399 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.780507 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.280488409 +0000 UTC m=+106.848118741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.783643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" event={"ID":"69e95a31-ebb4-4647-b358-ad5a85023485","Type":"ContainerStarted","Data":"d9c9f1a63789b17c6532fa3167d1b49c60f30c4e04a80cd3eaf8a0c94dff0521"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.783686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" event={"ID":"69e95a31-ebb4-4647-b358-ad5a85023485","Type":"ContainerStarted","Data":"672014e73ac63983a4f81f8efc6b262e683864c7d4791abcd7699d86aaf1ad08"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.805402 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.823190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" event={"ID":"dae4f68b-bbf7-441a-8c3f-8a260664215c","Type":"ContainerStarted","Data":"dc3d5323db484cf5b4c4ecd5ddbc22b42fad18fc41fa2f7c916b7b283bf57f94"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.829464 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" event={"ID":"602fd67e-3c82-46aa-879d-17bbd976e85b","Type":"ContainerStarted","Data":"85dce238b5f8b74e45138a0bfe7cf6f8b86db2f60e21828ef87af432010c7454"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.830568 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfz57" event={"ID":"95b4d24e-09da-4c0d-9d24-81621509024a","Type":"ContainerStarted","Data":"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.840052 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gjjbf" podStartSLOduration=6.839995325 podStartE2EDuration="6.839995325s" podCreationTimestamp="2026-03-18 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.770562768 +0000 UTC m=+106.338193110" watchObservedRunningTime="2026-03-18 18:04:31.839995325 +0000 UTC m=+106.407625647"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.862071 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podStartSLOduration=6.854921752 podStartE2EDuration="6.854921752s" podCreationTimestamp="2026-03-18 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.805277996 +0000 UTC m=+106.372908328" watchObservedRunningTime="2026-03-18 18:04:31.854921752 +0000 UTC m=+106.422552084"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.872946 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" event={"ID":"95049de3-9804-4b1a-ba38-495ecbff971b","Type":"ContainerStarted","Data":"0501429becfc4843087776cfb7565250c0ca49707add8b90b52061afda8de627"}
Mar 18 crc kubenswrapper[4830]: I0318 18:04:31.872993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" event={"ID":"95049de3-9804-4b1a-ba38-495ecbff971b","Type":"ContainerStarted","Data":"32a7a49c4c3163c0987252c2528f3744e3f5841bdc8e082f0d3ced11dad3de55"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.873734 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.882232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.883379 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" podStartSLOduration=64.883352872 podStartE2EDuration="1m4.883352872s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.841097313 +0000 UTC m=+106.408727645" watchObservedRunningTime="2026-03-18 18:04:31.883352872 +0000 UTC m=+106.450983204"
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.896919 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.396900984 +0000 UTC m=+106.964531316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.898480 4830 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2xz7d container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.898578 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" podUID="95049de3-9804-4b1a-ba38-495ecbff971b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.920535 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" event={"ID":"52dfe6d5-441c-4853-810d-f56db985d9bd","Type":"ContainerStarted","Data":"282469c5bfe2c1d5341ae56f9edec8bea6f555bb134ecd3c6600de66f856208d"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.952452 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rpbbt" podStartSLOduration=64.952421479 podStartE2EDuration="1m4.952421479s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.895364076 +0000 UTC m=+106.462994408" watchObservedRunningTime="2026-03-18 18:04:31.952421479 +0000 UTC m=+106.520051811"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.961888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qx95k" event={"ID":"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a","Type":"ContainerStarted","Data":"8aba406089e543daa5be53b54cb3a0ef1112249d801d33710bbfe1af43f811fa"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.961943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qx95k" event={"ID":"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a","Type":"ContainerStarted","Data":"a28bef0cbc347d4bbb68d15122b2c5c536b5cb4adef73518f0f8967681f527d7"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.978601 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-npj9g" podStartSLOduration=64.978575521 podStartE2EDuration="1m4.978575521s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:31.937374478 +0000 UTC m=+106.505004810" watchObservedRunningTime="2026-03-18 18:04:31.978575521 +0000 UTC m=+106.546205853"
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.987661 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r267p" event={"ID":"714819f9-4554-4f15-bf01-40ba2f401872","Type":"ContainerStarted","Data":"c354490f34c99e60527e62056d313e932bbb0db1baa6c4cb46ad77b046309a18"}
Mar 18 18:04:31 crc kubenswrapper[4830]: I0318 18:04:31.996144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:31 crc kubenswrapper[4830]: E0318 18:04:31.997869 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.497844018 +0000 UTC m=+107.065474350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.045337 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" event={"ID":"abd70175-8048-40dc-8f82-72d1112b0af0","Type":"ContainerStarted","Data":"76f07a1f029d60c3717e18802848150506d11ae9e1817d4ac702f4539e3d155d"}
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.100762 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.101978 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.601967013 +0000 UTC m=+107.169597345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.103179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" event={"ID":"a043d02e-8a8a-42e6-839d-15dc6c0b43b6","Type":"ContainerStarted","Data":"a51357b4dcc839efa10f3861fdfd28b70c41df07e227fc7df9b67468d424fcb4"}
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.134164 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" event={"ID":"884979c1-fccc-4bd6-b6db-4ec35bd9bdf7","Type":"ContainerStarted","Data":"23862085c038a02468b61a8248f62bc456e55ac83f01095d994a32a6d2b937a3"}
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.134729 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.152460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" event={"ID":"3ed8be83-14ab-44e2-9c05-bf0306320a71","Type":"ContainerStarted","Data":"5a95b2cd6c2d34f36dae86121bbaf7ab61b339d4808d575540b4637295e6b6e4"}
Mar 18 18:04:32 crc kubenswrapper[4830]: I0318
18:04:32.172975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" event={"ID":"2782ade8-7344-423c-8ace-e9fe0b0fd207","Type":"ContainerStarted","Data":"e9c31c468f851b283f72d3df0b35492a91cee91202445cf160ed3e9ff4b61b93"} Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.183874 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:04:32 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 18 18:04:32 crc kubenswrapper[4830]: [+]process-running ok Mar 18 18:04:32 crc kubenswrapper[4830]: healthz check failed Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.183935 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.184992 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" event={"ID":"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd","Type":"ContainerStarted","Data":"700ab01ede56bd510d26e17d13e3732f11162f81e3b75798444aefa630eeaf20"} Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.196227 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" event={"ID":"19aab548-96a0-4056-8226-f9e7cf4b3ca3","Type":"ContainerStarted","Data":"de7a7195b9b43f06d4c613b87f6701cc48012d5464974eeeccde1f5a1e890958"} Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.204471 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.205824 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.70579951 +0000 UTC m=+107.273429842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.221235 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" event={"ID":"94158f29-fe7c-44f5-85a6-ec3e7f39a50a","Type":"ContainerStarted","Data":"9cb5c93081877004b4c18de8cd7c39c06f19ecde967914c496e4e3fdb3464bc8"} Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.222982 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j2j42 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.223011 4830 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.223192 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7xl2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.223208 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7xl2" podUID="8b0c7aa1-248e-4847-97f5-c08c17e78c3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.230352 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lfz57" podStartSLOduration=66.23032265 podStartE2EDuration="1m6.23032265s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.099709645 +0000 UTC m=+106.667339977" watchObservedRunningTime="2026-03-18 18:04:32.23032265 +0000 UTC m=+106.797952982" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.286735 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.311678 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.312985 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.812972901 +0000 UTC m=+107.380603233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.397013 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" podStartSLOduration=65.396988237 podStartE2EDuration="1m5.396988237s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.382733446 +0000 UTC m=+106.950363778" watchObservedRunningTime="2026-03-18 18:04:32.396988237 +0000 UTC m=+106.964618569" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.412873 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.413350 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:32.91333207 +0000 UTC m=+107.480962402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.501106 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648" podStartSLOduration=66.50108646 podStartE2EDuration="1m6.50108646s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.500844974 +0000 UTC m=+107.068475306" watchObservedRunningTime="2026-03-18 18:04:32.50108646 +0000 UTC m=+107.068716792" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.514453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 
18:04:32.514761 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.014749136 +0000 UTC m=+107.582379468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.565013 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wx6kd"] Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.621970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.622334 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.122311567 +0000 UTC m=+107.689941899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.632385 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ltsqn" podStartSLOduration=65.632353241 podStartE2EDuration="1m5.632353241s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.594186636 +0000 UTC m=+107.161816978" watchObservedRunningTime="2026-03-18 18:04:32.632353241 +0000 UTC m=+107.199983573" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.727966 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.728404 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.228387741 +0000 UTC m=+107.796018073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.761005 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" podStartSLOduration=66.760985476 podStartE2EDuration="1m6.760985476s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.63032088 +0000 UTC m=+107.197951212" watchObservedRunningTime="2026-03-18 18:04:32.760985476 +0000 UTC m=+107.328615808" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.828431 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xnxfs" podStartSLOduration=66.828410082 podStartE2EDuration="1m6.828410082s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.82676105 +0000 UTC m=+107.394391382" watchObservedRunningTime="2026-03-18 18:04:32.828410082 +0000 UTC m=+107.396040414" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.828750 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wjtsp" podStartSLOduration=65.82874535 podStartE2EDuration="1m5.82874535s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.772172539 +0000 UTC m=+107.339802871" watchObservedRunningTime="2026-03-18 18:04:32.82874535 +0000 UTC m=+107.396375682" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.829217 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.829525 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.329510529 +0000 UTC m=+107.897140861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.907583 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgk4d" podStartSLOduration=66.907567904 podStartE2EDuration="1m6.907567904s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.906907157 +0000 UTC m=+107.474537489" watchObservedRunningTime="2026-03-18 18:04:32.907567904 +0000 UTC m=+107.475198236" Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.935683 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:32 crc kubenswrapper[4830]: E0318 18:04:32.936155 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.436138777 +0000 UTC m=+108.003769109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:32 crc kubenswrapper[4830]: I0318 18:04:32.952084 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5q9t5" Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.024836 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d" podStartSLOduration=66.024816541 podStartE2EDuration="1m6.024816541s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.023313803 +0000 UTC m=+107.590944135" watchObservedRunningTime="2026-03-18 18:04:33.024816541 +0000 UTC m=+107.592446873" Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.026001 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" podStartSLOduration=67.02599574 podStartE2EDuration="1m7.02599574s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:32.962168276 +0000 UTC m=+107.529798608" watchObservedRunningTime="2026-03-18 18:04:33.02599574 +0000 UTC m=+107.593626072" Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.038028 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.038514 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.538491607 +0000 UTC m=+108.106121939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.100841 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lrwxl"] Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.101173 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-59q79" podStartSLOduration=66.101151772 podStartE2EDuration="1m6.101151772s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.096154515 +0000 UTC m=+107.663784847" watchObservedRunningTime="2026-03-18 18:04:33.101151772 +0000 UTC m=+107.668782104" Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.141472 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.141947 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.641931274 +0000 UTC m=+108.209561596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.172986 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:04:33 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 18 18:04:33 crc kubenswrapper[4830]: [+]process-running ok Mar 18 18:04:33 crc kubenswrapper[4830]: healthz check failed Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.173075 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.217922 4830 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kczvm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.218012 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.229159 4830 patch_prober.go:28] interesting pod/console-operator-58897d9998-cww5w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.229244 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cww5w" podUID="8bcb1a4d-f708-4d3a-81f1-8373e36eb474" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.242113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.242623 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.742600091 +0000 UTC m=+108.310230423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.266146 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" event={"ID":"dae4f68b-bbf7-441a-8c3f-8a260664215c","Type":"ContainerStarted","Data":"3a6475f697a40bdeee3725fb74c1b730f03c3a76081290c5a35a2e1d30d68b4f"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.287980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7a23b983b1bdf67bed40565b28474847e11ad810a1c071612edb699f1d5a5d7f"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.288026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8d0bb2b6556fc2875722cb96bbe4e326eebce1700a17beab49c7ee7c081dfe6f"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.303261 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" event={"ID":"602fd67e-3c82-46aa-879d-17bbd976e85b","Type":"ContainerStarted","Data":"8afc8a625cf130444240a66e9ba4a19bb77af9c6092abc7f5c752dc0a976ec84"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.316463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" event={"ID":"437f27f7-4531-4e3e-b3a9-a471c7630012","Type":"ContainerStarted","Data":"916cdf71a4ec1c703dcac4997534800d5f94a64a6b31f149c4bfa3d30dd81bfc"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.319153 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e06721b4405dcfdb64b841f61f7640b7c01a1b5dd44c9293d53e97bbf86f0e71"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.319186 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6021360d206cc0c3ac66a73f19c979be087d622671e806ca810921d9e48d5b1b"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.319669 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.343526 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" event={"ID":"2b7160ce-096f-4305-9954-982608b133ac","Type":"ContainerStarted","Data":"7c0ef546195ffcb7033b3658fb490b98e045eb7da2e28e85f8273606c94f707c"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.343587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" event={"ID":"2b7160ce-096f-4305-9954-982608b133ac","Type":"ContainerStarted","Data":"1ff36848c45e2cc49d9a851ba91b3f69f24dfbe04f6e041e215f962115a926c1"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.349862 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.350247 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.850227743 +0000 UTC m=+108.417858075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.388654 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" event={"ID":"d43b41ee-1e3d-42bf-8856-0678d441ac96","Type":"ContainerStarted","Data":"1519575e85fa6ff9381e8db28f6c4abcb29e3f7efba420b5e1f59338b72aa683"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.402922 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3028eb01317bde3fe2dc4fc170ce5f6c5e8be6bf2d74cfc9ddb8b72650c9d5d4"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.402968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"da3adac6160afacab8ceec97fdb9fa3419e0cf6151c6b9584e22809b99e43500"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.422982 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f4wwz" event={"ID":"3ed8be83-14ab-44e2-9c05-bf0306320a71","Type":"ContainerStarted","Data":"1d7c1de4cc7ee389191cc21370441c11bf1a9851710aee5533a1dae5756db03c"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.441531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wnkwx" event={"ID":"b2d85f1d-8085-42e9-b3a3-dfa5eed4c3cd","Type":"ContainerStarted","Data":"f192a26ca437a893491bd61f46b84e8651ec4193eab59a31909c619de1a18140"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.444014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qx95k" event={"ID":"1fe6ec9e-1fc2-46f6-b3a8-257c478f278a","Type":"ContainerStarted","Data":"788ea968f24ba3b6d86a6ed0428e9626be0dc633b39ccfb1275a3490b50efeed"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.444565 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qx95k"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.459996 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.460709 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.960684788 +0000 UTC m=+108.528315120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.461078 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.468369 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:33.968345202 +0000 UTC m=+108.535975534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.491120 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w4n2s" podStartSLOduration=66.491089987 podStartE2EDuration="1m6.491089987s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.368540627 +0000 UTC m=+107.936170959" watchObservedRunningTime="2026-03-18 18:04:33.491089987 +0000 UTC m=+108.058720319"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.492648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r267p" event={"ID":"714819f9-4554-4f15-bf01-40ba2f401872","Type":"ContainerStarted","Data":"c5d1d09c69c30d827a631323e71b275da5fe9a53ebb8e97560423045a6b95337"}
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.501450 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j2j42 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.501508 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.13:8080/healthz\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.532155 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2xz7d"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.537168 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.563224 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.564056 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.064034893 +0000 UTC m=+108.631665225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.574588 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4cqtc"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.637726 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" podStartSLOduration=67.637706457 podStartE2EDuration="1m7.637706457s" podCreationTimestamp="2026-03-18 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.636156877 +0000 UTC m=+108.203787209" watchObservedRunningTime="2026-03-18 18:04:33.637706457 +0000 UTC m=+108.205336789"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.666038 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.680337 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.180308104 +0000 UTC m=+108.747938426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.709813 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-s77pq" podStartSLOduration=66.7097889 podStartE2EDuration="1m6.7097889s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.702181918 +0000 UTC m=+108.269812260" watchObservedRunningTime="2026-03-18 18:04:33.7097889 +0000 UTC m=+108.277419232"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.769868 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.770445 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.270420514 +0000 UTC m=+108.838050856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.805879 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qx95k" podStartSLOduration=8.80584814 podStartE2EDuration="8.80584814s" podCreationTimestamp="2026-03-18 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.798244088 +0000 UTC m=+108.365874420" watchObservedRunningTime="2026-03-18 18:04:33.80584814 +0000 UTC m=+108.373478482"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.871841 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.872276 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.372258321 +0000 UTC m=+108.939888673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.971612 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" podStartSLOduration=66.971591933 podStartE2EDuration="1m6.971591933s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:33.967665563 +0000 UTC m=+108.535295895" watchObservedRunningTime="2026-03-18 18:04:33.971591933 +0000 UTC m=+108.539222265"
Mar 18 18:04:33 crc kubenswrapper[4830]: I0318 18:04:33.973411 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:33 crc kubenswrapper[4830]: E0318 18:04:33.973733 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.473716517 +0000 UTC m=+109.041346849 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.075056 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.075549 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.575524482 +0000 UTC m=+109.143154814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.085719 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.158732 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:34 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:34 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:34 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.158936 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.175967 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.176461 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.676436465 +0000 UTC m=+109.244066797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.277537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.277923 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.777906123 +0000 UTC m=+109.345536445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.379274 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.379843 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.879816441 +0000 UTC m=+109.447446773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.420524 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52596: no serving certificate available for the kubelet"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.482025 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.482406 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:34.982384986 +0000 UTC m=+109.550015318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.497226 4830 patch_prober.go:28] interesting pod/console-operator-58897d9998-cww5w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.497280 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cww5w" podUID="8bcb1a4d-f708-4d3a-81f1-8373e36eb474" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.498544 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" event={"ID":"437f27f7-4531-4e3e-b3a9-a471c7630012","Type":"ContainerStarted","Data":"f427061e3f1ff8a68714ceef99582872ba4e848607c1470c8c8411b00c4fe2bd"}
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.498610 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wx6kd" event={"ID":"437f27f7-4531-4e3e-b3a9-a471c7630012","Type":"ContainerStarted","Data":"066b1719e3a2f85a7a4557b3d463d64617d821c5072dafa3f8bbfe8029fe6503"}
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.498760 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" gracePeriod=30
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.566221 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wx6kd" podStartSLOduration=67.566201676 podStartE2EDuration="1m7.566201676s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:34.562840451 +0000 UTC m=+109.130470783" watchObservedRunningTime="2026-03-18 18:04:34.566201676 +0000 UTC m=+109.133831998"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.568157 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52598: no serving certificate available for the kubelet"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.584163 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.584338 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.084310375 +0000 UTC m=+109.651940707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.584662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.584945 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.08493569 +0000 UTC m=+109.652566022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.664419 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52600: no serving certificate available for the kubelet"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.689804 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.691540 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.191515187 +0000 UTC m=+109.759145519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.762486 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"]
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.778201 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52610: no serving certificate available for the kubelet"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.792276 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.792589 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.292577284 +0000 UTC m=+109.860207616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.815995 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"]
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.870712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kr648"
Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.893438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.893813 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.393797975 +0000 UTC m=+109.961428307 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.896903 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52618: no serving certificate available for the kubelet" Mar 18 18:04:34 crc kubenswrapper[4830]: I0318 18:04:34.994525 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:34 crc kubenswrapper[4830]: E0318 18:04:34.994836 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.49481696 +0000 UTC m=+110.062447282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.026143 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szdp2"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.027033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.027323 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52626: no serving certificate available for the kubelet" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.041327 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.052435 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szdp2"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.101395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.101623 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.601579591 +0000 UTC m=+110.169209913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.101714 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.102404 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.602340471 +0000 UTC m=+110.169970803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.159594 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:04:35 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 18 18:04:35 crc kubenswrapper[4830]: [+]process-running ok Mar 18 18:04:35 crc kubenswrapper[4830]: healthz check failed Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.159835 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.166899 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52632: no serving certificate available for the kubelet" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.176143 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.177398 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.189406 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.193302 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.203510 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.203702 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.703671434 +0000 UTC m=+110.271301766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.204512 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.204691 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th75n\" (UniqueName: \"kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.204896 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.205036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.205539 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.705514741 +0000 UTC m=+110.273145083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.245910 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52634: no serving certificate available for the kubelet" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.306677 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.306856 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th75n\" (UniqueName: \"kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 
18:04:35.306896 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.806867815 +0000 UTC m=+110.374498147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.306948 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqkg9\" (UniqueName: \"kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " 
pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307231 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.307922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.308062 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:04:35.808026604 +0000 UTC m=+110.375657096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.308378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.346871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th75n\" (UniqueName: \"kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n\") pod \"community-operators-szdp2\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") " pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.375306 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-klcdh"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.377409 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.390509 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klcdh"] Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.408039 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.408266 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqkg9\" (UniqueName: \"kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.408361 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:35.908315482 +0000 UTC m=+110.475945824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.408444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.408619 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.408693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.408990 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:04:35.908974358 +0000 UTC m=+110.476604690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.409391 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.409730 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.470512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqkg9\" (UniqueName: \"kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9\") pod \"certified-operators-2tsg6\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") " pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.489588 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.509430 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.509584 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.009560863 +0000 UTC m=+110.577191195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.509691 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.509713 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.509748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.509783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzb79\" (UniqueName: \"kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.510058 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.010046105 +0000 UTC m=+110.577676437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.510308 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r267p" event={"ID":"714819f9-4554-4f15-bf01-40ba2f401872","Type":"ContainerStarted","Data":"b12014af959bbbc00937b57e9a4eb4a8e105d84b89fd29d07e2124386e8da18d"}
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.510356 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r267p" event={"ID":"714819f9-4554-4f15-bf01-40ba2f401872","Type":"ContainerStarted","Data":"a087597808b55756a94e5763a11a8bf35c6554b3edac48ba15f8fdae7663686d"}
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.510923 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerName="controller-manager" containerID="cri-o://df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4" gracePeriod=30
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.576015 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-czlcm"]
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.610583 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.612010 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.612440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.612472 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.612583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzb79\" (UniqueName: \"kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.615022 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.114997631 +0000 UTC m=+110.682627953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.617389 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.617645 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.645104 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szdp2"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.664448 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czlcm"]
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.681079 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzb79\" (UniqueName: \"kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79\") pod \"community-operators-klcdh\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") " pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.692354 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.713974 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.714062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.714094 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.714131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5q2g\" (UniqueName: \"kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.714564 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.214547739 +0000 UTC m=+110.782178061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.714972 4830 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.815953 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.816227 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.316180791 +0000 UTC m=+110.883811123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.816519 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5q2g\" (UniqueName: \"kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.816564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.816610 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.816633 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.817086 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.817607 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.817858 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.317844923 +0000 UTC m=+110.885475255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.833953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5q2g\" (UniqueName: \"kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g\") pod \"certified-operators-czlcm\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.917566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:35 crc kubenswrapper[4830]: E0318 18:04:35.917833 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.41774068 +0000 UTC m=+110.985371012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:35 crc kubenswrapper[4830]: I0318 18:04:35.936375 4830 ???:1] "http: TLS handshake error from 192.168.126.11:52650: no serving certificate available for the kubelet"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.010828 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.020294 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.020759 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.520740296 +0000 UTC m=+111.088370628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.121637 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.122099 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.62207353 +0000 UTC m=+111.189703862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.135368 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szdp2"]
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.142484 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"]
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.180282 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:36 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:36 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:36 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.180362 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.198746 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.223747 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.224364 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.724345457 +0000 UTC m=+111.291975789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.327900 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles\") pod \"35a6cab9-b63f-4ed5-ac08-897e894498c5\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328011 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca\") pod \"35a6cab9-b63f-4ed5-ac08-897e894498c5\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328238 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328362 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert\") pod \"35a6cab9-b63f-4ed5-ac08-897e894498c5\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zxs7\" (UniqueName: \"kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7\") pod \"35a6cab9-b63f-4ed5-ac08-897e894498c5\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config\") pod \"35a6cab9-b63f-4ed5-ac08-897e894498c5\" (UID: \"35a6cab9-b63f-4ed5-ac08-897e894498c5\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.328513 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.828472892 +0000 UTC m=+111.396103224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.328688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.329212 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:04:36.82919773 +0000 UTC m=+111.396828062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nr285" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.329482 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "35a6cab9-b63f-4ed5-ac08-897e894498c5" (UID: "35a6cab9-b63f-4ed5-ac08-897e894498c5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.329538 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config" (OuterVolumeSpecName: "config") pod "35a6cab9-b63f-4ed5-ac08-897e894498c5" (UID: "35a6cab9-b63f-4ed5-ac08-897e894498c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.329570 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "35a6cab9-b63f-4ed5-ac08-897e894498c5" (UID: "35a6cab9-b63f-4ed5-ac08-897e894498c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.338175 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7" (OuterVolumeSpecName: "kube-api-access-6zxs7") pod "35a6cab9-b63f-4ed5-ac08-897e894498c5" (UID: "35a6cab9-b63f-4ed5-ac08-897e894498c5"). InnerVolumeSpecName "kube-api-access-6zxs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.349661 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "35a6cab9-b63f-4ed5-ac08-897e894498c5" (UID: "35a6cab9-b63f-4ed5-ac08-897e894498c5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.375584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klcdh"]
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.382238 4830 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T18:04:35.715001871Z","Handler":null,"Name":""}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.386549 4830 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.386598 4830 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.432640 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.433045 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.433058 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35a6cab9-b63f-4ed5-ac08-897e894498c5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.433069 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zxs7\" (UniqueName: \"kubernetes.io/projected/35a6cab9-b63f-4ed5-ac08-897e894498c5-kube-api-access-6zxs7\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.433078 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.433085 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35a6cab9-b63f-4ed5-ac08-897e894498c5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.450707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.490103 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czlcm"]
Mar 18 18:04:36 crc kubenswrapper[4830]: W0318 18:04:36.494073 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa33f19_15c5_4f78_88ef_1db6eb605aa7.slice/crio-06c5295f74367633149d70f9a50caf740d43a319753a7049456b9f2f0819aedf WatchSource:0}: Error finding container 06c5295f74367633149d70f9a50caf740d43a319753a7049456b9f2f0819aedf: Status 404 returned error can't find the container with id 06c5295f74367633149d70f9a50caf740d43a319753a7049456b9f2f0819aedf
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.523415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerStarted","Data":"06c5295f74367633149d70f9a50caf740d43a319753a7049456b9f2f0819aedf"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.530725 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerID="a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad" exitCode=0
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.535456 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.536878 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.530978 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerDied","Data":"a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.537974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerStarted","Data":"71d5582447836975691b3f07089271b4406186ddc756751f426b635cc694078b"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.547034 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.547071 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.563259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r267p" event={"ID":"714819f9-4554-4f15-bf01-40ba2f401872","Type":"ContainerStarted","Data":"56ba36373d5dbad3897032b9323498d5f2799e87642739703410558dc4eadd40"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.591358 4830 generic.go:334] "Generic (PLEG): container finished" podID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerID="df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4" exitCode=0
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.591505 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" event={"ID":"35a6cab9-b63f-4ed5-ac08-897e894498c5","Type":"ContainerDied","Data":"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.591552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr" event={"ID":"35a6cab9-b63f-4ed5-ac08-897e894498c5","Type":"ContainerDied","Data":"ac01a25dc5e6d39db2ae11b713c3febc375735b48f2d725dd976dd196a3c75cf"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.591587 4830 scope.go:117] "RemoveContainer" containerID="df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.591815 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cnwlr"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.592218 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nr285\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.597052 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-r267p" podStartSLOduration=11.597031196 podStartE2EDuration="11.597031196s" podCreationTimestamp="2026-03-18 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:36.59245464 +0000 UTC m=+111.160084972" watchObservedRunningTime="2026-03-18 18:04:36.597031196 +0000 UTC m=+111.164661528"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.620860 4830 generic.go:334] "Generic (PLEG): container finished" podID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerID="8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46" exitCode=0
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.620963 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerDied","Data":"8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.620989 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerStarted","Data":"c4679f51866bb6dfe4b090db3a1f5996d6f4217a9f48336c793076441a85c87e"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.623759 4830 scope.go:117] "RemoveContainer" containerID="df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.624651 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4\": container with ID starting with df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4 not found: ID does not exist" containerID="df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.624681 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4"} err="failed to get container status \"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4\": rpc error: code = NotFound desc = could not find container \"df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4\": container with ID starting with df71dba90fd8d7a4a61b513e9e04447374d83370486ca198e2ce4ec93e7c66a4 not found: ID does not exist"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.634063 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerName="route-controller-manager" containerID="cri-o://71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d" gracePeriod=30
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.634177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerStarted","Data":"e7c7d3b3bab45bacb82d6a5a86d7bfa957c42b029cc974cd5e089fab79087c8d"}
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.662033 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"]
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.666118 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cnwlr"]
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.756359 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.937312 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 18 18:04:36 crc kubenswrapper[4830]: E0318 18:04:36.937565 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerName="controller-manager"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.937581 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerName="controller-manager"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.937706 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" containerName="controller-manager"
Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.939191 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.940752 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.940887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.944217 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.944525 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.947859 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 18:04:36 crc kubenswrapper[4830]: I0318 18:04:36.998371 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.004287 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.041845 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.042064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.042253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.060791 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.142622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca\") pod \"defba3b0-b36e-4b8e-a8b1-577782a54249\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.142686 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvnn\" (UniqueName: \"kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn\") pod \"defba3b0-b36e-4b8e-a8b1-577782a54249\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.142703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert\") pod \"defba3b0-b36e-4b8e-a8b1-577782a54249\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.142780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config\") pod \"defba3b0-b36e-4b8e-a8b1-577782a54249\" (UID: \"defba3b0-b36e-4b8e-a8b1-577782a54249\") " Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.143854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca" (OuterVolumeSpecName: "client-ca") pod "defba3b0-b36e-4b8e-a8b1-577782a54249" (UID: "defba3b0-b36e-4b8e-a8b1-577782a54249"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.143883 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config" (OuterVolumeSpecName: "config") pod "defba3b0-b36e-4b8e-a8b1-577782a54249" (UID: "defba3b0-b36e-4b8e-a8b1-577782a54249"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.146360 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "defba3b0-b36e-4b8e-a8b1-577782a54249" (UID: "defba3b0-b36e-4b8e-a8b1-577782a54249"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.147257 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn" (OuterVolumeSpecName: "kube-api-access-6tvnn") pod "defba3b0-b36e-4b8e-a8b1-577782a54249" (UID: "defba3b0-b36e-4b8e-a8b1-577782a54249"). InnerVolumeSpecName "kube-api-access-6tvnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.155892 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.156216 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:04:37 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 18 18:04:37 crc kubenswrapper[4830]: [+]process-running ok Mar 18 18:04:37 crc kubenswrapper[4830]: healthz check failed Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.156413 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:04:37 crc 
kubenswrapper[4830]: E0318 18:04:37.156714 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerName="route-controller-manager" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.156808 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerName="route-controller-manager" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.156992 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerName="route-controller-manager" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.157746 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.165253 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.168634 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.243858 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.243895 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvnn\" (UniqueName: \"kubernetes.io/projected/defba3b0-b36e-4b8e-a8b1-577782a54249-kube-api-access-6tvnn\") on node \"crc\" DevicePath \"\"" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.243905 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defba3b0-b36e-4b8e-a8b1-577782a54249-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:04:37 crc 
kubenswrapper[4830]: I0318 18:04:37.243914 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba3b0-b36e-4b8e-a8b1-577782a54249-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.247180 4830 ???:1] "http: TLS handshake error from 192.168.126.11:49792: no serving certificate available for the kubelet" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.291823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.345602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.345675 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.345764 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbpht\" (UniqueName: \"kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.446469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.446549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbpht\" (UniqueName: \"kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.446608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.447020 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.447288 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.451810 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 
18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.451849 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.461687 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.472901 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbpht\" (UniqueName: \"kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht\") pod \"redhat-marketplace-pgl85\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") " pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.473271 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.516201 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 18:04:37 crc kubenswrapper[4830]: W0318 18:04:37.526123 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod70e3f193_7e57_4f53_b136_7642548b0767.slice/crio-68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d WatchSource:0}: Error finding container 68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d: Status 404 returned error can't find the container with id 68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.553061 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.554054 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.565436 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.661206 4830 generic.go:334] "Generic (PLEG): container finished" podID="defba3b0-b36e-4b8e-a8b1-577782a54249" containerID="71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d" exitCode=0 Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.661635 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" event={"ID":"defba3b0-b36e-4b8e-a8b1-577782a54249","Type":"ContainerDied","Data":"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.661647 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.661681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h" event={"ID":"defba3b0-b36e-4b8e-a8b1-577782a54249","Type":"ContainerDied","Data":"2216f56c8f6a58801f8930f2cb045c34fa23d8a783b3914886e516f7ea4c2efa"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.661715 4830 scope.go:117] "RemoveContainer" containerID="71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.684540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" event={"ID":"f83d2867-10a5-46ca-9f3c-caedae650499","Type":"ContainerStarted","Data":"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 
18:04:37.684604 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" event={"ID":"f83d2867-10a5-46ca-9f3c-caedae650499","Type":"ContainerStarted","Data":"bd444a56f83b7168edd0b851e2c895ba41073af54f370fd23bdfa82a5ba88956"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.684826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.701794 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.719439 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" podStartSLOduration=70.719418021 podStartE2EDuration="1m10.719418021s" podCreationTimestamp="2026-03-18 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:37.713697186 +0000 UTC m=+112.281327518" watchObservedRunningTime="2026-03-18 18:04:37.719418021 +0000 UTC m=+112.287048353" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.733187 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7xl2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.733277 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t7xl2" podUID="8b0c7aa1-248e-4847-97f5-c08c17e78c3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 
18:04:37.733187 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7xl2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.733340 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7xl2" podUID="8b0c7aa1-248e-4847-97f5-c08c17e78c3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.754362 4830 generic.go:334] "Generic (PLEG): container finished" podID="19aab548-96a0-4056-8226-f9e7cf4b3ca3" containerID="de7a7195b9b43f06d4c613b87f6701cc48012d5464974eeeccde1f5a1e890958" exitCode=0 Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.754424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" event={"ID":"19aab548-96a0-4056-8226-f9e7cf4b3ca3","Type":"ContainerDied","Data":"de7a7195b9b43f06d4c613b87f6701cc48012d5464974eeeccde1f5a1e890958"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.762145 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"70e3f193-7e57-4f53-b136-7642548b0767","Type":"ContainerStarted","Data":"68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.764004 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1176643-c4d6-4be9-8317-a99886a32b29" containerID="392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c" exitCode=0 Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.764155 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerDied","Data":"392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.754022 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.773675 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzx6\" (UniqueName: \"kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.774024 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.778878 4830 generic.go:334] "Generic (PLEG): container finished" podID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerID="4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8" exitCode=0 Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.781138 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" 
event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerDied","Data":"4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8"} Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.793625 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-78nbq" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.799349 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.799959 4830 scope.go:117] "RemoveContainer" containerID="71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d" Mar 18 18:04:37 crc kubenswrapper[4830]: E0318 18:04:37.800269 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d\": container with ID starting with 71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d not found: ID does not exist" containerID="71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.800292 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d"} err="failed to get container status \"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d\": rpc error: code = NotFound desc = could not find container \"71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d\": container with ID starting with 71995208bc4a6026db9f35d327bf7ef5221f0e88b84d41677626505393d9465d not found: ID does not exist" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.802761 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-27p2h"] Mar 18 
18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.875480 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.875567 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzx6\" (UniqueName: \"kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.875643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.878129 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.879911 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 
18:04:37.936914 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.949001 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cww5w" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.949817 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzx6\" (UniqueName: \"kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6\") pod \"redhat-marketplace-sk6wx\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") " pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.979833 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.981016 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.982349 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.982837 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.983336 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.984856 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.985005 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.983389 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.989559 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"] Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.989623 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.990159 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.990295 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.990293 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.991011 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.991445 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.993234 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:04:37 crc kubenswrapper[4830]: I0318 18:04:37.995914 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.003917 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:04:38 crc 
kubenswrapper[4830]: I0318 18:04:38.005062 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.029166 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.029308 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.040108 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.078867 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.078918 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.078943 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " 
pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.078983 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.079010 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dbg6\" (UniqueName: \"kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.079037 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh88t\" (UniqueName: \"kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.079054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.079100 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.079115 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.152825 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mfdzp" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.154933 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.155911 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:04:38 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 18 18:04:38 crc kubenswrapper[4830]: [+]process-running ok Mar 18 18:04:38 crc kubenswrapper[4830]: healthz check failed Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.155952 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:04:38 crc kubenswrapper[4830]: 
I0318 18:04:38.156095 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.158656 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.175886 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180041 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config\") pod 
\"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180218 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dbg6\" (UniqueName: \"kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180294 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh88t\" (UniqueName: \"kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.180312 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.181143 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.182693 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.183080 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.183612 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " 
pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.184166 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.187101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.193256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.201744 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.205495 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dbg6\" (UniqueName: \"kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6\") pod \"route-controller-manager-789cdc55bc-7snpt\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 
18:04:38.205595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh88t\" (UniqueName: \"kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t\") pod \"controller-manager-5bdf586cd6-pg526\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") " pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.244039 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a6cab9-b63f-4ed5-ac08-897e894498c5" path="/var/lib/kubelet/pods/35a6cab9-b63f-4ed5-ac08-897e894498c5/volumes" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.246101 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.246711 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defba3b0-b36e-4b8e-a8b1-577782a54249" path="/var/lib/kubelet/pods/defba3b0-b36e-4b8e-a8b1-577782a54249/volumes" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.281684 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.281801 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 
18:04:38.281823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476hk\" (UniqueName: \"kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.369825 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.373716 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.380046 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.383356 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.383420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.383454 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476hk\" (UniqueName: 
\"kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.383890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.384026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.400648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476hk\" (UniqueName: \"kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk\") pod \"redhat-operators-5rhcb\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") " pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.479244 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.558419 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.563118 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.577874 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.688043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.688610 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.688684 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbxr\" (UniqueName: \"kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.740310 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.740399 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.743924 4830 patch_prober.go:28] interesting pod/console-f9d7485db-lfz57 container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.744042 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lfz57" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.790318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.790416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbxr\" (UniqueName: \"kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.790466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.792791 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content\") pod \"redhat-operators-7qfcr\" 
(UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.792835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.805456 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.809125 4830 generic.go:334] "Generic (PLEG): container finished" podID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerID="31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63" exitCode=0 Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.809209 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerDied","Data":"31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63"} Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.809271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerStarted","Data":"6320f1125d9bfab72190e3c7917331467a5a0d30f90223ee475c0666444555ab"} Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.825678 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbxr\" (UniqueName: \"kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr\") pod \"redhat-operators-7qfcr\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") " pod="openshift-marketplace/redhat-operators-7qfcr" Mar 18 18:04:38 crc kubenswrapper[4830]: 
I0318 18:04:38.867085 4830 generic.go:334] "Generic (PLEG): container finished" podID="70e3f193-7e57-4f53-b136-7642548b0767" containerID="239e3a37e53e8537b3b48a9b02ec21a5a5a2c6c88f9bdbfd442d95d1b213e945" exitCode=0 Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.867240 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"70e3f193-7e57-4f53-b136-7642548b0767","Type":"ContainerDied","Data":"239e3a37e53e8537b3b48a9b02ec21a5a5a2c6c88f9bdbfd442d95d1b213e945"} Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.900926 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.922870 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.923668 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:04:38 crc kubenswrapper[4830]: E0318 18:04:38.927484 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.928281 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.928528 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 18:04:38 crc kubenswrapper[4830]: E0318 18:04:38.945347 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.945677 4830 generic.go:334] "Generic (PLEG): container finished" podID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerID="6ce5799dd83d5007902c4d4afc49d4759d15f0e4e4e54f35ea8e19b46a37d987" exitCode=0
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.947267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerDied","Data":"6ce5799dd83d5007902c4d4afc49d4759d15f0e4e4e54f35ea8e19b46a37d987"}
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.947301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerStarted","Data":"641429fab93568eccc5393904a74582a2edc0a5605340276dc967a94b469507e"}
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.953780 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bzlw5"
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.957746 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 18 18:04:38 crc kubenswrapper[4830]: E0318 18:04:38.962230 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 18:04:38 crc kubenswrapper[4830]: E0318 18:04:38.971368 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins"
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.998896 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:38 crc kubenswrapper[4830]: I0318 18:04:38.999060 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.012763 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"]
Mar 18 18:04:39 crc kubenswrapper[4830]: W0318 18:04:39.040876 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e70641_19a9_4646_899e_4427b292fe9a.slice/crio-e96e5758b822656fa20f37ef5793b06764f6628d49f2c26a0b93f9f832a18865 WatchSource:0}: Error finding container e96e5758b822656fa20f37ef5793b06764f6628d49f2c26a0b93f9f832a18865: Status 404 returned error can't find the container with id e96e5758b822656fa20f37ef5793b06764f6628d49f2c26a0b93f9f832a18865
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.055853 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.101252 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.101342 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.101439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.128121 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.164114 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:39 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:39 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:39 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.164180 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.294329 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.345126 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.404390 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume\") pod \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") "
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.404497 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2kn\" (UniqueName: \"kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn\") pod \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") "
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.404676 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") pod \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\" (UID: \"19aab548-96a0-4056-8226-f9e7cf4b3ca3\") "
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.405988 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume" (OuterVolumeSpecName: "config-volume") pod "19aab548-96a0-4056-8226-f9e7cf4b3ca3" (UID: "19aab548-96a0-4056-8226-f9e7cf4b3ca3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.417209 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn" (OuterVolumeSpecName: "kube-api-access-5s2kn") pod "19aab548-96a0-4056-8226-f9e7cf4b3ca3" (UID: "19aab548-96a0-4056-8226-f9e7cf4b3ca3"). InnerVolumeSpecName "kube-api-access-5s2kn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.418074 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "19aab548-96a0-4056-8226-f9e7cf4b3ca3" (UID: "19aab548-96a0-4056-8226-f9e7cf4b3ca3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.452558 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"]
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.511126 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19aab548-96a0-4056-8226-f9e7cf4b3ca3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.511158 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2kn\" (UniqueName: \"kubernetes.io/projected/19aab548-96a0-4056-8226-f9e7cf4b3ca3-kube-api-access-5s2kn\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.511167 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19aab548-96a0-4056-8226-f9e7cf4b3ca3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.610668 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.852627 4830 ???:1] "http: TLS handshake error from 192.168.126.11:49802: no serving certificate available for the kubelet"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.960039 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v" event={"ID":"19aab548-96a0-4056-8226-f9e7cf4b3ca3","Type":"ContainerDied","Data":"25c71557e35285fabcbc5018999db857e6c2d7dceef7c15de7d8b301679c6aa1"}
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.960347 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c71557e35285fabcbc5018999db857e6c2d7dceef7c15de7d8b301679c6aa1"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.960836 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.973888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" event={"ID":"4b3a31b8-84ff-4138-b4fe-20f46753715f","Type":"ContainerStarted","Data":"f94428caf35dd1b6a927918c31ae5254b0e796a65fed5d8d344f97132bae01da"}
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.974095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" event={"ID":"4b3a31b8-84ff-4138-b4fe-20f46753715f","Type":"ContainerStarted","Data":"4204a5155f5e46cbe5b6e62d3670ab8078e41829c3ce1838b01ccc510686bd4d"}
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.975048 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526"
Mar 18 18:04:39 crc kubenswrapper[4830]: I0318 18:04:39.994942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9748e57d-46f0-4d93-aa53-1d79513e9905","Type":"ContainerStarted","Data":"49352d888337755b70bca578f64c9c8647b6608a447761d776482684c9f7bb06"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:39.998352 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerStarted","Data":"1bd37c2658a4378af117a7c45e9906a137b357875491262f937baad633be79f5"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.001565 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.021562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" event={"ID":"67e70641-19a9-4646-899e-4427b292fe9a","Type":"ContainerStarted","Data":"5ac19d9934cac89a58cfa0737151c93ae314752183751f2c76ab100a09480a0d"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.021621 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" event={"ID":"67e70641-19a9-4646-899e-4427b292fe9a","Type":"ContainerStarted","Data":"e96e5758b822656fa20f37ef5793b06764f6628d49f2c26a0b93f9f832a18865"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.021901 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.022660 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" podStartSLOduration=4.022635202 podStartE2EDuration="4.022635202s" podCreationTimestamp="2026-03-18 18:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:40.011988963 +0000 UTC m=+114.579619285" watchObservedRunningTime="2026-03-18 18:04:40.022635202 +0000 UTC m=+114.590265544"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.051141 4830 generic.go:334] "Generic (PLEG): container finished" podID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerID="3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6" exitCode=0
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.051271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerDied","Data":"3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.051317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerStarted","Data":"81ee93b655560cebea802e010ceae8cd1a58abe1d5512b542da024ce9497bc30"}
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.072842 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" podStartSLOduration=4.072817742 podStartE2EDuration="4.072817742s" podCreationTimestamp="2026-03-18 18:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:40.066741458 +0000 UTC m=+114.634371780" watchObservedRunningTime="2026-03-18 18:04:40.072817742 +0000 UTC m=+114.640448074"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.096857 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.175100 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:40 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:40 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:40 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.175168 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.491216 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.532657 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access\") pod \"70e3f193-7e57-4f53-b136-7642548b0767\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") "
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.532799 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir\") pod \"70e3f193-7e57-4f53-b136-7642548b0767\" (UID: \"70e3f193-7e57-4f53-b136-7642548b0767\") "
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.532938 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70e3f193-7e57-4f53-b136-7642548b0767" (UID: "70e3f193-7e57-4f53-b136-7642548b0767"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.533283 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e3f193-7e57-4f53-b136-7642548b0767-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.543059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70e3f193-7e57-4f53-b136-7642548b0767" (UID: "70e3f193-7e57-4f53-b136-7642548b0767"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:04:40 crc kubenswrapper[4830]: I0318 18:04:40.634786 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e3f193-7e57-4f53-b136-7642548b0767-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.064209 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9748e57d-46f0-4d93-aa53-1d79513e9905","Type":"ContainerStarted","Data":"043de016e56364ef9c2917162621ccbc5828efddde9b8df5cd7de9dff4f5fe27"}
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.081022 4830 generic.go:334] "Generic (PLEG): container finished" podID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerID="9542d7f933d806aba5e1a3274bf315e3bc7dfac8aa8d5d5b349f291a0e972b9b" exitCode=0
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.081808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerDied","Data":"9542d7f933d806aba5e1a3274bf315e3bc7dfac8aa8d5d5b349f291a0e972b9b"}
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.089987 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"70e3f193-7e57-4f53-b136-7642548b0767","Type":"ContainerDied","Data":"68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d"}
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.090058 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b759e33edb7c8e8cbcf3b142e16185326918bacad0995d5e370128cfb0200d"
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.090162 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.106691 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.106669278 podStartE2EDuration="3.106669278s" podCreationTimestamp="2026-03-18 18:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:41.08618982 +0000 UTC m=+115.653820152" watchObservedRunningTime="2026-03-18 18:04:41.106669278 +0000 UTC m=+115.674299610"
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.156960 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:41 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:41 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:41 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:41 crc kubenswrapper[4830]: I0318 18:04:41.157151 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:42 crc kubenswrapper[4830]: I0318 18:04:42.138319 4830 generic.go:334] "Generic (PLEG): container finished" podID="9748e57d-46f0-4d93-aa53-1d79513e9905" containerID="043de016e56364ef9c2917162621ccbc5828efddde9b8df5cd7de9dff4f5fe27" exitCode=0
Mar 18 18:04:42 crc kubenswrapper[4830]: I0318 18:04:42.138432 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9748e57d-46f0-4d93-aa53-1d79513e9905","Type":"ContainerDied","Data":"043de016e56364ef9c2917162621ccbc5828efddde9b8df5cd7de9dff4f5fe27"}
Mar 18 18:04:42 crc kubenswrapper[4830]: I0318 18:04:42.155460 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:42 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:42 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:42 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:42 crc kubenswrapper[4830]: I0318 18:04:42.155514 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:43 crc kubenswrapper[4830]: I0318 18:04:43.154597 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:43 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:43 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:43 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:43 crc kubenswrapper[4830]: I0318 18:04:43.155000 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:43 crc kubenswrapper[4830]: I0318 18:04:43.509472 4830 ???:1] "http: TLS handshake error from 192.168.126.11:49812: no serving certificate available for the kubelet"
Mar 18 18:04:43 crc kubenswrapper[4830]: I0318 18:04:43.812551 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qx95k"
Mar 18 18:04:44 crc kubenswrapper[4830]: I0318 18:04:44.156394 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:44 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:44 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:44 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:44 crc kubenswrapper[4830]: I0318 18:04:44.156507 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:45 crc kubenswrapper[4830]: I0318 18:04:45.005081 4830 ???:1] "http: TLS handshake error from 192.168.126.11:49816: no serving certificate available for the kubelet"
Mar 18 18:04:45 crc kubenswrapper[4830]: I0318 18:04:45.166040 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:45 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:45 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:45 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:45 crc kubenswrapper[4830]: I0318 18:04:45.166094 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:46 crc kubenswrapper[4830]: I0318 18:04:46.155488 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:46 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:46 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:46 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:46 crc kubenswrapper[4830]: I0318 18:04:46.155580 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:47 crc kubenswrapper[4830]: I0318 18:04:47.159133 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:47 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:47 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:47 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:47 crc kubenswrapper[4830]: I0318 18:04:47.159506 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:47 crc kubenswrapper[4830]: I0318 18:04:47.245966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 18 18:04:47 crc kubenswrapper[4830]: I0318 18:04:47.736080 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t7xl2"
Mar 18 18:04:47 crc kubenswrapper[4830]: I0318 18:04:47.772740 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.772721606 podStartE2EDuration="772.721606ms" podCreationTimestamp="2026-03-18 18:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:04:47.750942885 +0000 UTC m=+122.318573227" watchObservedRunningTime="2026-03-18 18:04:47.772721606 +0000 UTC m=+122.340351938"
Mar 18 18:04:48 crc kubenswrapper[4830]: I0318 18:04:48.155328 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:48 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:48 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:48 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:48 crc kubenswrapper[4830]: I0318 18:04:48.155397 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:48 crc kubenswrapper[4830]: I0318 18:04:48.741486 4830 patch_prober.go:28] interesting pod/console-f9d7485db-lfz57 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Mar 18 18:04:48 crc kubenswrapper[4830]: I0318 18:04:48.741547 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lfz57" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused"
Mar 18 18:04:48 crc kubenswrapper[4830]: E0318 18:04:48.870554 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 18:04:48 crc kubenswrapper[4830]: E0318 18:04:48.872754 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 18:04:48 crc kubenswrapper[4830]: E0318 18:04:48.874663 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 18:04:48 crc kubenswrapper[4830]: E0318 18:04:48.874816 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins"
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.155535 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:49 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld
Mar 18 18:04:49 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:49 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.155609 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.283183 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.407530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access\") pod \"9748e57d-46f0-4d93-aa53-1d79513e9905\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") "
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.407653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir\") pod \"9748e57d-46f0-4d93-aa53-1d79513e9905\" (UID: \"9748e57d-46f0-4d93-aa53-1d79513e9905\") "
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.407734 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9748e57d-46f0-4d93-aa53-1d79513e9905" (UID: "9748e57d-46f0-4d93-aa53-1d79513e9905"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.408053 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9748e57d-46f0-4d93-aa53-1d79513e9905-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.414191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9748e57d-46f0-4d93-aa53-1d79513e9905" (UID: "9748e57d-46f0-4d93-aa53-1d79513e9905"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:04:49 crc kubenswrapper[4830]: I0318 18:04:49.509426 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9748e57d-46f0-4d93-aa53-1d79513e9905-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:04:50 crc kubenswrapper[4830]: I0318 18:04:50.156137 4830 patch_prober.go:28] interesting pod/router-default-5444994796-mfdzp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:04:50 crc kubenswrapper[4830]: [+]has-synced ok
Mar 18 18:04:50 crc kubenswrapper[4830]: [+]process-running ok
Mar 18 18:04:50 crc kubenswrapper[4830]: healthz check failed
Mar 18 18:04:50 crc kubenswrapper[4830]: I0318 18:04:50.156219 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mfdzp" podUID="008bfbc9-9b16-4769-ba0a-116a67b7fdb4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:04:50 crc kubenswrapper[4830]: I0318 18:04:50.224952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9748e57d-46f0-4d93-aa53-1d79513e9905","Type":"ContainerDied","Data":"49352d888337755b70bca578f64c9c8647b6608a447761d776482684c9f7bb06"}
Mar 18 18:04:50 crc kubenswrapper[4830]: I0318 18:04:50.224996 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49352d888337755b70bca578f64c9c8647b6608a447761d776482684c9f7bb06"
Mar 18 18:04:50 crc kubenswrapper[4830]: I0318 18:04:50.225085 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 18:04:51 crc kubenswrapper[4830]: I0318 18:04:51.157152 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mfdzp"
Mar 18 18:04:51 crc kubenswrapper[4830]: I0318 18:04:51.162000 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mfdzp"
Mar 18 18:04:53 crc kubenswrapper[4830]: I0318 18:04:53.809159 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"]
Mar 18 18:04:53 crc kubenswrapper[4830]: I0318 18:04:53.810126 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager" containerID="cri-o://f94428caf35dd1b6a927918c31ae5254b0e796a65fed5d8d344f97132bae01da" gracePeriod=30
Mar 18 18:04:53 crc kubenswrapper[4830]: I0318 18:04:53.828527 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"]
Mar 18 18:04:53 crc kubenswrapper[4830]: I0318 18:04:53.828783 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" podUID="67e70641-19a9-4646-899e-4427b292fe9a" containerName="route-controller-manager" containerID="cri-o://5ac19d9934cac89a58cfa0737151c93ae314752183751f2c76ab100a09480a0d" gracePeriod=30
Mar 18 18:04:54 crc kubenswrapper[4830]: I0318 18:04:54.246931 4830 generic.go:334] "Generic (PLEG): container finished" podID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerID="f94428caf35dd1b6a927918c31ae5254b0e796a65fed5d8d344f97132bae01da" exitCode=0
Mar 18 18:04:54 crc kubenswrapper[4830]: I0318 18:04:54.247021 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" event={"ID":"4b3a31b8-84ff-4138-b4fe-20f46753715f","Type":"ContainerDied","Data":"f94428caf35dd1b6a927918c31ae5254b0e796a65fed5d8d344f97132bae01da"}
Mar 18 18:04:54 crc kubenswrapper[4830]: I0318 18:04:54.248949 4830 generic.go:334] "Generic (PLEG): container finished" podID="67e70641-19a9-4646-899e-4427b292fe9a" containerID="5ac19d9934cac89a58cfa0737151c93ae314752183751f2c76ab100a09480a0d" exitCode=0
Mar 18 18:04:54 crc kubenswrapper[4830]: I0318 18:04:54.248999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" event={"ID":"67e70641-19a9-4646-899e-4427b292fe9a","Type":"ContainerDied","Data":"5ac19d9934cac89a58cfa0737151c93ae314752183751f2c76ab100a09480a0d"}
Mar 18 18:04:56 crc kubenswrapper[4830]: I0318 18:04:56.764126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:04:58 crc kubenswrapper[4830]: I0318 18:04:58.381006 4830 patch_prober.go:28] interesting pod/controller-manager-5bdf586cd6-pg526 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Mar 18 18:04:58 crc kubenswrapper[4830]: I0318 18:04:58.381412 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Mar 18 18:04:58 crc kubenswrapper[4830]: I0318 18:04:58.750869 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:58 crc kubenswrapper[4830]: I0318 18:04:58.753915 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:04:58 crc kubenswrapper[4830]: E0318 18:04:58.872470 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:04:58 crc kubenswrapper[4830]: E0318 18:04:58.873806 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:04:58 crc kubenswrapper[4830]: E0318 18:04:58.885181 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:04:58 crc kubenswrapper[4830]: E0318 18:04:58.885259 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins" Mar 18 18:04:59 crc kubenswrapper[4830]: I0318 18:04:59.370979 4830 patch_prober.go:28] interesting pod/route-controller-manager-789cdc55bc-7snpt container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: i/o timeout" start-of-body= Mar 18 18:04:59 crc kubenswrapper[4830]: I0318 18:04:59.371051 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" podUID="67e70641-19a9-4646-899e-4427b292fe9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: i/o timeout" Mar 18 18:05:05 crc kubenswrapper[4830]: I0318 18:05:05.255595 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 18:05:05 crc kubenswrapper[4830]: I0318 18:05:05.309222 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lrwxl_1e193992-9fbb-46cc-bb80-ed0563456687/kube-multus-additional-cni-plugins/0.log" Mar 18 18:05:05 crc kubenswrapper[4830]: I0318 18:05:05.309279 4830 generic.go:334] "Generic (PLEG): container finished" podID="1e193992-9fbb-46cc-bb80-ed0563456687" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" exitCode=137 Mar 18 18:05:05 crc kubenswrapper[4830]: I0318 18:05:05.310292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" event={"ID":"1e193992-9fbb-46cc-bb80-ed0563456687","Type":"ContainerDied","Data":"81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c"} Mar 18 18:05:05 crc kubenswrapper[4830]: E0318 18:05:05.311741 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 18:05:05 crc kubenswrapper[4830]: E0318 18:05:05.311942 4830 kuberuntime_manager.go:1274] "Unhandled Error" 
err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-476hk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5rhcb_openshift-marketplace(b9160fc9-aa00-4ce7-9ea2-15aac1e11e00): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:05:05 crc kubenswrapper[4830]: E0318 18:05:05.313155 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5rhcb" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" Mar 18 18:05:05 crc kubenswrapper[4830]: I0318 18:05:05.523371 4830 ???:1] "http: TLS handshake error from 192.168.126.11:49794: no serving certificate available for the kubelet" Mar 18 18:05:06 crc kubenswrapper[4830]: I0318 18:05:06.263127 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.263097175 podStartE2EDuration="1.263097175s" podCreationTimestamp="2026-03-18 18:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:06.256344811 +0000 UTC m=+140.823975143" watchObservedRunningTime="2026-03-18 18:05:06.263097175 +0000 UTC m=+140.830727587" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.004648 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6kfbf" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.286109 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5rhcb" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.333797 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.338740 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" event={"ID":"67e70641-19a9-4646-899e-4427b292fe9a","Type":"ContainerDied","Data":"e96e5758b822656fa20f37ef5793b06764f6628d49f2c26a0b93f9f832a18865"} Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.338799 4830 scope.go:117] "RemoveContainer" containerID="5ac19d9934cac89a58cfa0737151c93ae314752183751f2c76ab100a09480a0d" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.338831 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375498 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"] Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.375743 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e70641-19a9-4646-899e-4427b292fe9a" containerName="route-controller-manager" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375757 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e70641-19a9-4646-899e-4427b292fe9a" containerName="route-controller-manager" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.375779 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9748e57d-46f0-4d93-aa53-1d79513e9905" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375786 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9748e57d-46f0-4d93-aa53-1d79513e9905" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.375798 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19aab548-96a0-4056-8226-f9e7cf4b3ca3" containerName="collect-profiles" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375805 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="19aab548-96a0-4056-8226-f9e7cf4b3ca3" containerName="collect-profiles" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.375818 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e3f193-7e57-4f53-b136-7642548b0767" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375824 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e3f193-7e57-4f53-b136-7642548b0767" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375907 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="19aab548-96a0-4056-8226-f9e7cf4b3ca3" containerName="collect-profiles" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375921 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e70641-19a9-4646-899e-4427b292fe9a" containerName="route-controller-manager" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375935 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9748e57d-46f0-4d93-aa53-1d79513e9905" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.375940 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e3f193-7e57-4f53-b136-7642548b0767" containerName="pruner" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.376287 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.380670 4830 patch_prober.go:28] interesting pod/controller-manager-5bdf586cd6-pg526 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.380711 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.380967 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"] Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.395615 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config\") pod \"67e70641-19a9-4646-899e-4427b292fe9a\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.395837 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca\") pod \"67e70641-19a9-4646-899e-4427b292fe9a\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.395890 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert\") pod \"67e70641-19a9-4646-899e-4427b292fe9a\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396015 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dbg6\" (UniqueName: \"kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6\") pod \"67e70641-19a9-4646-899e-4427b292fe9a\" (UID: \"67e70641-19a9-4646-899e-4427b292fe9a\") " Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396262 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396327 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktlq\" (UniqueName: \"kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396485 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396558 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "67e70641-19a9-4646-899e-4427b292fe9a" (UID: "67e70641-19a9-4646-899e-4427b292fe9a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.396598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config" (OuterVolumeSpecName: "config") pod "67e70641-19a9-4646-899e-4427b292fe9a" (UID: "67e70641-19a9-4646-899e-4427b292fe9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.402439 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67e70641-19a9-4646-899e-4427b292fe9a" (UID: "67e70641-19a9-4646-899e-4427b292fe9a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.403244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6" (OuterVolumeSpecName: "kube-api-access-6dbg6") pod "67e70641-19a9-4646-899e-4427b292fe9a" (UID: "67e70641-19a9-4646-899e-4427b292fe9a"). InnerVolumeSpecName "kube-api-access-6dbg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.403946 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.404097 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5q2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-czlcm_openshift-marketplace(3fa33f19-15c5-4f78-88ef-1db6eb605aa7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.405483 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-czlcm" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" Mar 18 18:05:08 crc 
kubenswrapper[4830]: I0318 18:05:08.499593 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499663 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499707 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktlq\" (UniqueName: \"kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499795 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:08 crc 
kubenswrapper[4830]: I0318 18:05:08.499808 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e70641-19a9-4646-899e-4427b292fe9a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499820 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dbg6\" (UniqueName: \"kubernetes.io/projected/67e70641-19a9-4646-899e-4427b292fe9a-kube-api-access-6dbg6\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.499832 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e70641-19a9-4646-899e-4427b292fe9a-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.500679 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.500848 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.503940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.516590 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktlq\" (UniqueName: \"kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq\") pod \"route-controller-manager-6f5fc9d9b-6cckb\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.679460 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"] Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.683453 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-789cdc55bc-7snpt"] Mar 18 18:05:08 crc kubenswrapper[4830]: I0318 18:05:08.701973 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.869422 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c is running failed: container process not found" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.869958 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c is running failed: container process not found" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.870428 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c is running failed: container process not found" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 18:05:08 crc kubenswrapper[4830]: E0318 18:05:08.870468 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins" Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 
18:05:09.866004 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-czlcm" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.951124 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.951328 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-th75n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-szdp2_openshift-marketplace(b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.952600 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-szdp2" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b"
Mar 18 18:05:09 crc kubenswrapper[4830]: I0318 18:05:09.978045 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526"
Mar 18 18:05:09 crc kubenswrapper[4830]: I0318 18:05:09.990067 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lrwxl_1e193992-9fbb-46cc-bb80-ed0563456687/kube-multus-additional-cni-plugins/0.log"
Mar 18 18:05:09 crc kubenswrapper[4830]: I0318 18:05:09.990158 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.995114 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.995310 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dkzx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sk6wx_openshift-marketplace(dbfe6d63-05ee-40d2-affa-03b9310a27c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 18 18:05:09 crc kubenswrapper[4830]: E0318 18:05:09.997008 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sk6wx" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.008604 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.008785 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dqkg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2tsg6_openshift-marketplace(4d8e7b87-f442-4d60-bd65-35eacd097689): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.010147 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2tsg6" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689"
Mar 18 18:05:10 crc
kubenswrapper[4830]: I0318 18:05:10.028995 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready\") pod \"1e193992-9fbb-46cc-bb80-ed0563456687\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029375 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config\") pod \"4b3a31b8-84ff-4138-b4fe-20f46753715f\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029417 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist\") pod \"1e193992-9fbb-46cc-bb80-ed0563456687\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029449 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert\") pod \"4b3a31b8-84ff-4138-b4fe-20f46753715f\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca\") pod \"4b3a31b8-84ff-4138-b4fe-20f46753715f\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029553 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh88t\" (UniqueName: \"kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t\") pod \"4b3a31b8-84ff-4138-b4fe-20f46753715f\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles\") pod \"4b3a31b8-84ff-4138-b4fe-20f46753715f\" (UID: \"4b3a31b8-84ff-4138-b4fe-20f46753715f\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dbl8\" (UniqueName: \"kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8\") pod \"1e193992-9fbb-46cc-bb80-ed0563456687\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.029814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir\") pod \"1e193992-9fbb-46cc-bb80-ed0563456687\" (UID: \"1e193992-9fbb-46cc-bb80-ed0563456687\") "
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.030316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "1e193992-9fbb-46cc-bb80-ed0563456687" (UID: "1e193992-9fbb-46cc-bb80-ed0563456687"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.030863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "1e193992-9fbb-46cc-bb80-ed0563456687" (UID: "1e193992-9fbb-46cc-bb80-ed0563456687"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.031110 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config" (OuterVolumeSpecName: "config") pod "4b3a31b8-84ff-4138-b4fe-20f46753715f" (UID: "4b3a31b8-84ff-4138-b4fe-20f46753715f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.031159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready" (OuterVolumeSpecName: "ready") pod "1e193992-9fbb-46cc-bb80-ed0563456687" (UID: "1e193992-9fbb-46cc-bb80-ed0563456687"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.031411 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4b3a31b8-84ff-4138-b4fe-20f46753715f" (UID: "4b3a31b8-84ff-4138-b4fe-20f46753715f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.031569 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b3a31b8-84ff-4138-b4fe-20f46753715f" (UID: "4b3a31b8-84ff-4138-b4fe-20f46753715f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.041840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t" (OuterVolumeSpecName: "kube-api-access-sh88t") pod "4b3a31b8-84ff-4138-b4fe-20f46753715f" (UID: "4b3a31b8-84ff-4138-b4fe-20f46753715f"). InnerVolumeSpecName "kube-api-access-sh88t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.050859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8" (OuterVolumeSpecName: "kube-api-access-9dbl8") pod "1e193992-9fbb-46cc-bb80-ed0563456687" (UID: "1e193992-9fbb-46cc-bb80-ed0563456687"). InnerVolumeSpecName "kube-api-access-9dbl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.050996 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b3a31b8-84ff-4138-b4fe-20f46753715f" (UID: "4b3a31b8-84ff-4138-b4fe-20f46753715f"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.080430 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.080544 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzb79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-klcdh_openshift-marketplace(b1176643-c4d6-4be9-8317-a99886a32b29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.082733 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-klcdh" podUID="b1176643-c4d6-4be9-8317-a99886a32b29"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131588 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131622 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh88t\" (UniqueName: \"kubernetes.io/projected/4b3a31b8-84ff-4138-b4fe-20f46753715f-kube-api-access-sh88t\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131633 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131643 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dbl8\" (UniqueName: \"kubernetes.io/projected/1e193992-9fbb-46cc-bb80-ed0563456687-kube-api-access-9dbl8\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131652 4830 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e193992-9fbb-46cc-bb80-ed0563456687-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131662 4830 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1e193992-9fbb-46cc-bb80-ed0563456687-ready\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131671 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3a31b8-84ff-4138-b4fe-20f46753715f-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131679 4830 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e193992-9fbb-46cc-bb80-ed0563456687-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.131687 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3a31b8-84ff-4138-b4fe-20f46753715f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.245540 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e70641-19a9-4646-899e-4427b292fe9a" path="/var/lib/kubelet/pods/67e70641-19a9-4646-899e-4427b292fe9a/volumes"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.350962 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lrwxl_1e193992-9fbb-46cc-bb80-ed0563456687/kube-multus-additional-cni-plugins/0.log"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.351314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl" event={"ID":"1e193992-9fbb-46cc-bb80-ed0563456687","Type":"ContainerDied","Data":"9bed7717e556d33909244919a91e052eb4990fe7f7b890f2c09d8f0650485320"}
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.351536 4830 scope.go:117] "RemoveContainer" containerID="81b8be7dfb58f04ef54ad801b0e06061e15df68bb8d2b73116eb4d548beb2a4c"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.351746 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lrwxl"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.356509 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526" event={"ID":"4b3a31b8-84ff-4138-b4fe-20f46753715f","Type":"ContainerDied","Data":"4204a5155f5e46cbe5b6e62d3670ab8078e41829c3ce1838b01ccc510686bd4d"}
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.356592 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bdf586cd6-pg526"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.365905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerStarted","Data":"7d89b672c8810d0f1d427f94ad1f802170a8e44b939f4438a00f114dbcc1398b"}
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.372488 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.374325 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerStarted","Data":"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1"}
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.385483 4830 scope.go:117] "RemoveContainer" containerID="f94428caf35dd1b6a927918c31ae5254b0e796a65fed5d8d344f97132bae01da"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.387911 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2tsg6" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.388187 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-klcdh" podUID="b1176643-c4d6-4be9-8317-a99886a32b29"
Mar 18 18:05:10 crc kubenswrapper[4830]: W0318 18:05:10.389453 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce0279b_c198_44a3_bb28_365f2ccf845e.slice/crio-0d8714ea2e4a356bd070aca004f8bc14b09613e1f4195982a58624b22165e9b9 WatchSource:0}: Error finding container 0d8714ea2e4a356bd070aca004f8bc14b09613e1f4195982a58624b22165e9b9: Status 404 returned error can't find the container with id 0d8714ea2e4a356bd070aca004f8bc14b09613e1f4195982a58624b22165e9b9
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.389528 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-szdp2" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.389550 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sk6wx" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.421527 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.425733 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bdf586cd6-pg526"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.429291 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lrwxl"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.432802 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lrwxl"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.929178 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.929630 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.929644 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins"
Mar 18 18:05:10 crc kubenswrapper[4830]: E0318 18:05:10.929659 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.929665 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.929753 4830 memory_manager.go:354] "RemoveStaleState removing state"
podUID="1e193992-9fbb-46cc-bb80-ed0563456687" containerName="kube-multus-additional-cni-plugins"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.929784 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" containerName="controller-manager"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.930114 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.934484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.936310 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.942613 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.996681 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"]
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.997379 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:10 crc kubenswrapper[4830]: I0318 18:05:10.999529 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.000114 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.000142 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.000206 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.000712 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.001529 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.046034 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"]
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.048745 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.146846 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnf57\" (UniqueName: \"kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.146926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.147186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.147236 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.147271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.147325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.147369 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.167586 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248184 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248258 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnf57\" (UniqueName: \"kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248334 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.248434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.249899 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.251323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.251552 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.252435 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.255851 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7"
Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.280228 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnf57\" (UniqueName:
\"kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57\") pod \"controller-manager-665b54db8d-jwqg7\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.284101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.362929 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.381131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" event={"ID":"6ce0279b-c198-44a3-bb28-365f2ccf845e","Type":"ContainerStarted","Data":"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f"} Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.381172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" event={"ID":"6ce0279b-c198-44a3-bb28-365f2ccf845e","Type":"ContainerStarted","Data":"0d8714ea2e4a356bd070aca004f8bc14b09613e1f4195982a58624b22165e9b9"} Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.381884 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.383729 4830 generic.go:334] "Generic (PLEG): container finished" podID="f62cb37f-382a-4e46-adf7-a26bac073bbe" 
containerID="7d89b672c8810d0f1d427f94ad1f802170a8e44b939f4438a00f114dbcc1398b" exitCode=0 Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.383786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerDied","Data":"7d89b672c8810d0f1d427f94ad1f802170a8e44b939f4438a00f114dbcc1398b"} Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.385604 4830 generic.go:334] "Generic (PLEG): container finished" podID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerID="960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1" exitCode=0 Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.385658 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerDied","Data":"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1"} Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.391389 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.407408 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" podStartSLOduration=18.407383537 podStartE2EDuration="18.407383537s" podCreationTimestamp="2026-03-18 18:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:11.403365084 +0000 UTC m=+145.970995426" watchObservedRunningTime="2026-03-18 18:05:11.407383537 +0000 UTC m=+145.975013869" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.577744 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.589347 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"] Mar 18 18:05:11 crc kubenswrapper[4830]: W0318 18:05:11.602305 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b0f47d_6e61_48e3_97cf_4f63945a9f88.slice/crio-de7df5b38dcc7015f69f6c6c31d7e8f0a4e16bd14d405d2d1af7ca9b1903d20d WatchSource:0}: Error finding container de7df5b38dcc7015f69f6c6c31d7e8f0a4e16bd14d405d2d1af7ca9b1903d20d: Status 404 returned error can't find the container with id de7df5b38dcc7015f69f6c6c31d7e8f0a4e16bd14d405d2d1af7ca9b1903d20d Mar 18 18:05:11 crc kubenswrapper[4830]: I0318 18:05:11.776711 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 18:05:11 crc kubenswrapper[4830]: W0318 18:05:11.785399 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda0eef53b_2d7d_4ef8_a037_2f51125e8ed5.slice/crio-5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252 WatchSource:0}: Error finding container 5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252: Status 404 returned error can't find the container with id 5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252 Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.241521 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e193992-9fbb-46cc-bb80-ed0563456687" path="/var/lib/kubelet/pods/1e193992-9fbb-46cc-bb80-ed0563456687/volumes" Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.242641 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3a31b8-84ff-4138-b4fe-20f46753715f" path="/var/lib/kubelet/pods/4b3a31b8-84ff-4138-b4fe-20f46753715f/volumes" Mar 18 18:05:12 
crc kubenswrapper[4830]: I0318 18:05:12.409131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerStarted","Data":"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7"} Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.411633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5","Type":"ContainerStarted","Data":"5666cf36af9c7b9e2bfd9dc1e88f8c5f01775f388b761b6efa73aa8ad62adcad"} Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.411671 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5","Type":"ContainerStarted","Data":"5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252"} Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.413800 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" event={"ID":"d8b0f47d-6e61-48e3-97cf-4f63945a9f88","Type":"ContainerStarted","Data":"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18"} Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.413828 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.413838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" event={"ID":"d8b0f47d-6e61-48e3-97cf-4f63945a9f88","Type":"ContainerStarted","Data":"de7df5b38dcc7015f69f6c6c31d7e8f0a4e16bd14d405d2d1af7ca9b1903d20d"} Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.418095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.425283 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pgl85" podStartSLOduration=2.175984785 podStartE2EDuration="35.42526462s" podCreationTimestamp="2026-03-18 18:04:37 +0000 UTC" firstStartedPulling="2026-03-18 18:04:38.829476285 +0000 UTC m=+113.397106617" lastFinishedPulling="2026-03-18 18:05:12.07875612 +0000 UTC m=+146.646386452" observedRunningTime="2026-03-18 18:05:12.421174664 +0000 UTC m=+146.988804996" watchObservedRunningTime="2026-03-18 18:05:12.42526462 +0000 UTC m=+146.992894952" Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.442107 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" podStartSLOduration=19.442090297 podStartE2EDuration="19.442090297s" podCreationTimestamp="2026-03-18 18:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:12.439873354 +0000 UTC m=+147.007503706" watchObservedRunningTime="2026-03-18 18:05:12.442090297 +0000 UTC m=+147.009720629" Mar 18 18:05:12 crc kubenswrapper[4830]: I0318 18:05:12.455420 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.455403678 podStartE2EDuration="2.455403678s" podCreationTimestamp="2026-03-18 18:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:12.451422677 +0000 UTC m=+147.019053009" watchObservedRunningTime="2026-03-18 18:05:12.455403678 +0000 UTC m=+147.023034010" Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.423701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerStarted","Data":"7a23acdb3d6685042b800a8661791721484afadb1a264dc58175449e05fcb860"} Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.427386 4830 generic.go:334] "Generic (PLEG): container finished" podID="a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" containerID="5666cf36af9c7b9e2bfd9dc1e88f8c5f01775f388b761b6efa73aa8ad62adcad" exitCode=0 Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.427530 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5","Type":"ContainerDied","Data":"5666cf36af9c7b9e2bfd9dc1e88f8c5f01775f388b761b6efa73aa8ad62adcad"} Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.447300 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qfcr" podStartSLOduration=4.216164798 podStartE2EDuration="35.447274569s" podCreationTimestamp="2026-03-18 18:04:38 +0000 UTC" firstStartedPulling="2026-03-18 18:04:41.092214662 +0000 UTC m=+115.659844994" lastFinishedPulling="2026-03-18 18:05:12.323324423 +0000 UTC m=+146.890954765" observedRunningTime="2026-03-18 18:05:13.445179479 +0000 UTC m=+148.012809811" watchObservedRunningTime="2026-03-18 18:05:13.447274569 +0000 UTC m=+148.014904911" Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.778560 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"] Mar 18 18:05:13 crc kubenswrapper[4830]: I0318 18:05:13.878509 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"] Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.432692 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" podUID="6ce0279b-c198-44a3-bb28-365f2ccf845e" containerName="route-controller-manager" containerID="cri-o://6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f" gracePeriod=30 Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.768214 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.795145 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir\") pod \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.795208 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access\") pod \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\" (UID: \"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.796410 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" (UID: "a0eef53b-2d7d-4ef8-a037-2f51125e8ed5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.804965 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" (UID: "a0eef53b-2d7d-4ef8-a037-2f51125e8ed5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.856012 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.895940 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert\") pod \"6ce0279b-c198-44a3-bb28-365f2ccf845e\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.896009 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ktlq\" (UniqueName: \"kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq\") pod \"6ce0279b-c198-44a3-bb28-365f2ccf845e\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.896041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config\") pod \"6ce0279b-c198-44a3-bb28-365f2ccf845e\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.896211 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.896226 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0eef53b-2d7d-4ef8-a037-2f51125e8ed5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.897228 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config" (OuterVolumeSpecName: "config") pod "6ce0279b-c198-44a3-bb28-365f2ccf845e" (UID: "6ce0279b-c198-44a3-bb28-365f2ccf845e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.899637 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ce0279b-c198-44a3-bb28-365f2ccf845e" (UID: "6ce0279b-c198-44a3-bb28-365f2ccf845e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.901037 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq" (OuterVolumeSpecName: "kube-api-access-7ktlq") pod "6ce0279b-c198-44a3-bb28-365f2ccf845e" (UID: "6ce0279b-c198-44a3-bb28-365f2ccf845e"). InnerVolumeSpecName "kube-api-access-7ktlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.996833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca\") pod \"6ce0279b-c198-44a3-bb28-365f2ccf845e\" (UID: \"6ce0279b-c198-44a3-bb28-365f2ccf845e\") " Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.997282 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ktlq\" (UniqueName: \"kubernetes.io/projected/6ce0279b-c198-44a3-bb28-365f2ccf845e-kube-api-access-7ktlq\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.997308 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.997321 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ce0279b-c198-44a3-bb28-365f2ccf845e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:14 crc kubenswrapper[4830]: I0318 18:05:14.997346 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ce0279b-c198-44a3-bb28-365f2ccf845e" (UID: "6ce0279b-c198-44a3-bb28-365f2ccf845e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.000316 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"] Mar 18 18:05:15 crc kubenswrapper[4830]: E0318 18:05:15.000543 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" containerName="pruner" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.000560 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" containerName="pruner" Mar 18 18:05:15 crc kubenswrapper[4830]: E0318 18:05:15.000590 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce0279b-c198-44a3-bb28-365f2ccf845e" containerName="route-controller-manager" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.000597 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce0279b-c198-44a3-bb28-365f2ccf845e" containerName="route-controller-manager" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.000708 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce0279b-c198-44a3-bb28-365f2ccf845e" containerName="route-controller-manager" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.000726 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0eef53b-2d7d-4ef8-a037-2f51125e8ed5" containerName="pruner" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.001216 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.013713 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"] Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.098205 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvsg9\" (UniqueName: \"kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.098288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.098445 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.098591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: 
\"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.098683 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ce0279b-c198-44a3-bb28-365f2ccf845e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.177899 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"] Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.199415 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.199840 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.199909 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvsg9\" (UniqueName: \"kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.200966 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.201350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.201398 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.207439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.224262 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvsg9\" (UniqueName: \"kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9\") pod \"route-controller-manager-66c479d4cd-w8c42\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") " 
pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.318164 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.445741 4830 generic.go:334] "Generic (PLEG): container finished" podID="6ce0279b-c198-44a3-bb28-365f2ccf845e" containerID="6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f" exitCode=0 Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.445825 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.445847 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" event={"ID":"6ce0279b-c198-44a3-bb28-365f2ccf845e","Type":"ContainerDied","Data":"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f"} Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.445910 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb" event={"ID":"6ce0279b-c198-44a3-bb28-365f2ccf845e","Type":"ContainerDied","Data":"0d8714ea2e4a356bd070aca004f8bc14b09613e1f4195982a58624b22165e9b9"} Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.445933 4830 scope.go:117] "RemoveContainer" containerID="6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.447583 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a0eef53b-2d7d-4ef8-a037-2f51125e8ed5","Type":"ContainerDied","Data":"5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252"} Mar 18 
18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.447643 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5108d903df9fc97adc036717e7d12bea4e33fbfb4d146fb675d1a81f7c7ff252" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.447601 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.447784 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" podUID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" containerName="controller-manager" containerID="cri-o://eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18" gracePeriod=30 Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.500561 4830 scope.go:117] "RemoveContainer" containerID="6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f" Mar 18 18:05:15 crc kubenswrapper[4830]: E0318 18:05:15.501886 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f\": container with ID starting with 6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f not found: ID does not exist" containerID="6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.501940 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f"} err="failed to get container status \"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f\": rpc error: code = NotFound desc = could not find container \"6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f\": container with ID starting with 
6a17841231fe0ea7964d3f993ff8c37002411d9aa388eefecac463971bc3e35f not found: ID does not exist" Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.502754 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"] Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.510496 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5fc9d9b-6cckb"] Mar 18 18:05:15 crc kubenswrapper[4830]: I0318 18:05:15.604949 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"] Mar 18 18:05:15 crc kubenswrapper[4830]: W0318 18:05:15.629370 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c618f7c_fe7d_4eb1_b78a_937c9d28e3e9.slice/crio-f1ebdc510b2cb94e958bab1d36ca3093b31108ce8b4c8f9faf9dc6d126eb62c1 WatchSource:0}: Error finding container f1ebdc510b2cb94e958bab1d36ca3093b31108ce8b4c8f9faf9dc6d126eb62c1: Status 404 returned error can't find the container with id f1ebdc510b2cb94e958bab1d36ca3093b31108ce8b4c8f9faf9dc6d126eb62c1 Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.193112 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.243822 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce0279b-c198-44a3-bb28-365f2ccf845e" path="/var/lib/kubelet/pods/6ce0279b-c198-44a3-bb28-365f2ccf845e/volumes" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.311864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnf57\" (UniqueName: \"kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57\") pod \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312155 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles\") pod \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312263 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config\") pod \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312319 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert\") pod \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312353 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca\") pod \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\" (UID: \"d8b0f47d-6e61-48e3-97cf-4f63945a9f88\") " Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312886 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d8b0f47d-6e61-48e3-97cf-4f63945a9f88" (UID: "d8b0f47d-6e61-48e3-97cf-4f63945a9f88"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.312902 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca" (OuterVolumeSpecName: "client-ca") pod "d8b0f47d-6e61-48e3-97cf-4f63945a9f88" (UID: "d8b0f47d-6e61-48e3-97cf-4f63945a9f88"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.313092 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config" (OuterVolumeSpecName: "config") pod "d8b0f47d-6e61-48e3-97cf-4f63945a9f88" (UID: "d8b0f47d-6e61-48e3-97cf-4f63945a9f88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.317460 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d8b0f47d-6e61-48e3-97cf-4f63945a9f88" (UID: "d8b0f47d-6e61-48e3-97cf-4f63945a9f88"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.317570 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57" (OuterVolumeSpecName: "kube-api-access-qnf57") pod "d8b0f47d-6e61-48e3-97cf-4f63945a9f88" (UID: "d8b0f47d-6e61-48e3-97cf-4f63945a9f88"). InnerVolumeSpecName "kube-api-access-qnf57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.413527 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.413550 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.413560 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.413571 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.413580 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnf57\" (UniqueName: \"kubernetes.io/projected/d8b0f47d-6e61-48e3-97cf-4f63945a9f88-kube-api-access-qnf57\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.455303 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" containerID="eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18" exitCode=0 Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.455349 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" event={"ID":"d8b0f47d-6e61-48e3-97cf-4f63945a9f88","Type":"ContainerDied","Data":"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18"} Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.455380 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.455398 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665b54db8d-jwqg7" event={"ID":"d8b0f47d-6e61-48e3-97cf-4f63945a9f88","Type":"ContainerDied","Data":"de7df5b38dcc7015f69f6c6c31d7e8f0a4e16bd14d405d2d1af7ca9b1903d20d"} Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.455413 4830 scope.go:117] "RemoveContainer" containerID="eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.463498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" event={"ID":"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9","Type":"ContainerStarted","Data":"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"} Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.463532 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" event={"ID":"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9","Type":"ContainerStarted","Data":"f1ebdc510b2cb94e958bab1d36ca3093b31108ce8b4c8f9faf9dc6d126eb62c1"} Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.464025 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.503196 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" podStartSLOduration=3.503179883 podStartE2EDuration="3.503179883s" podCreationTimestamp="2026-03-18 18:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:16.492382765 +0000 UTC m=+151.060013097" watchObservedRunningTime="2026-03-18 18:05:16.503179883 +0000 UTC m=+151.070810205" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.504926 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"] Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.505424 4830 scope.go:117] "RemoveContainer" containerID="eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.507554 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-665b54db8d-jwqg7"] Mar 18 18:05:16 crc kubenswrapper[4830]: E0318 18:05:16.510145 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18\": container with ID starting with eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18 not found: ID does not exist" containerID="eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.510215 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18"} 
err="failed to get container status \"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18\": rpc error: code = NotFound desc = could not find container \"eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18\": container with ID starting with eead50a8429ac2838cb30b2c0dea4e3477461721afefdead11cfd4a5974aba18 not found: ID does not exist" Mar 18 18:05:16 crc kubenswrapper[4830]: I0318 18:05:16.887005 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.003285 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"] Mar 18 18:05:17 crc kubenswrapper[4830]: E0318 18:05:17.003502 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" containerName="controller-manager" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.003514 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" containerName="controller-manager" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.003625 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" containerName="controller-manager" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.010233 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.014576 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.014700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.014868 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.014954 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.015288 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.015630 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.024262 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.028491 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"] Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.128386 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " 
pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.128459 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.128490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwng\" (UniqueName: \"kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.128545 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.128572 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.141093 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 
18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.142601 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.148640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.149645 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.149795 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.229529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.229890 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.229916 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwng\" (UniqueName: \"kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc 
kubenswrapper[4830]: I0318 18:05:17.230607 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.231081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.231160 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.231181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.232119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " 
pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.239824 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.245711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwng\" (UniqueName: \"kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng\") pod \"controller-manager-77ddc5bfb6-95ffc\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") " pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.333505 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.333566 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.333828 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.336995 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.436079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.436162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.436227 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.436314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.436341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.455122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.463073 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.474137 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.474527 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.566680 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"] Mar 18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.723338 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 18:05:17 crc kubenswrapper[4830]: W0318 18:05:17.733572 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a1d7a8b_7a09_41f3_828b_83e57be4bac7.slice/crio-05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c WatchSource:0}: Error finding container 05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c: Status 404 returned error can't find the container with id 05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c Mar 
18 18:05:17 crc kubenswrapper[4830]: I0318 18:05:17.833785 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pgl85" Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.243080 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b0f47d-6e61-48e3-97cf-4f63945a9f88" path="/var/lib/kubelet/pods/d8b0f47d-6e61-48e3-97cf-4f63945a9f88/volumes" Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.477586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" event={"ID":"0c7a577c-7ae8-4a3a-a4e8-51254422ea62","Type":"ContainerStarted","Data":"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"} Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.477990 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.478003 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" event={"ID":"0c7a577c-7ae8-4a3a-a4e8-51254422ea62","Type":"ContainerStarted","Data":"3b16d53b13e8401a45e640fbbae97e35d198798b05f7e9b8e11490e0d7bfdace"} Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.479398 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5a1d7a8b-7a09-41f3-828b-83e57be4bac7","Type":"ContainerStarted","Data":"180be0aef1f3145a96040a06d16cfbdfbf4a972fc09ee5a2b38354e0903a5550"} Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.479426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5a1d7a8b-7a09-41f3-828b-83e57be4bac7","Type":"ContainerStarted","Data":"05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c"} Mar 18 18:05:18 crc kubenswrapper[4830]: 
I0318 18:05:18.486674 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"
Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.500561 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" podStartSLOduration=5.500536115 podStartE2EDuration="5.500536115s" podCreationTimestamp="2026-03-18 18:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:18.497563687 +0000 UTC m=+153.065194019" watchObservedRunningTime="2026-03-18 18:05:18.500536115 +0000 UTC m=+153.068166447"
Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.544903 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.5448861040000001 podStartE2EDuration="1.544886104s" podCreationTimestamp="2026-03-18 18:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:18.541331657 +0000 UTC m=+153.108961989" watchObservedRunningTime="2026-03-18 18:05:18.544886104 +0000 UTC m=+153.112516436"
Mar 18 18:05:18 crc kubenswrapper[4830]: I0318 18:05:18.558914 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pgl85"
Mar 18 18:05:19 crc kubenswrapper[4830]: I0318 18:05:19.056632 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:19 crc kubenswrapper[4830]: I0318 18:05:19.056689 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:20 crc kubenswrapper[4830]: I0318 18:05:20.102627 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qfcr" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:05:20 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:05:20 crc kubenswrapper[4830]: >
Mar 18 18:05:21 crc kubenswrapper[4830]: I0318 18:05:21.495721 4830 generic.go:334] "Generic (PLEG): container finished" podID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerID="ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c" exitCode=0
Mar 18 18:05:21 crc kubenswrapper[4830]: I0318 18:05:21.495784 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerDied","Data":"ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c"}
Mar 18 18:05:24 crc kubenswrapper[4830]: I0318 18:05:24.519953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerStarted","Data":"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c"}
Mar 18 18:05:24 crc kubenswrapper[4830]: I0318 18:05:24.535436 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rhcb" podStartSLOduration=2.45271507 podStartE2EDuration="46.535420822s" podCreationTimestamp="2026-03-18 18:04:38 +0000 UTC" firstStartedPulling="2026-03-18 18:04:40.057212087 +0000 UTC m=+114.624842419" lastFinishedPulling="2026-03-18 18:05:24.139917839 +0000 UTC m=+158.707548171" observedRunningTime="2026-03-18 18:05:24.534782861 +0000 UTC m=+159.102413193" watchObservedRunningTime="2026-03-18 18:05:24.535420822 +0000 UTC m=+159.103051154"
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.531917 4830 generic.go:334] "Generic (PLEG): container finished" podID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerID="d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9" exitCode=0
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.532003 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerDied","Data":"d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9"}
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.534853 4830 generic.go:334] "Generic (PLEG): container finished" podID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerID="4b99ac56390f23695d876f72270af61627a83473bda5ee2d9c5392fcc6871595" exitCode=0
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.534945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerDied","Data":"4b99ac56390f23695d876f72270af61627a83473bda5ee2d9c5392fcc6871595"}
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.539445 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1176643-c4d6-4be9-8317-a99886a32b29" containerID="ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270" exitCode=0
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.539534 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerDied","Data":"ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270"}
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.543587 4830 generic.go:334] "Generic (PLEG): container finished" podID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerID="3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc" exitCode=0
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.543667 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerDied","Data":"3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc"}
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.547993 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerID="b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad" exitCode=0
Mar 18 18:05:25 crc kubenswrapper[4830]: I0318 18:05:25.548032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerDied","Data":"b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad"}
Mar 18 18:05:26 crc kubenswrapper[4830]: I0318 18:05:26.566291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerStarted","Data":"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"}
Mar 18 18:05:26 crc kubenswrapper[4830]: I0318 18:05:26.588641 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-klcdh" podStartSLOduration=3.066962532 podStartE2EDuration="51.588620766s" podCreationTimestamp="2026-03-18 18:04:35 +0000 UTC" firstStartedPulling="2026-03-18 18:04:37.800356059 +0000 UTC m=+112.367986391" lastFinishedPulling="2026-03-18 18:05:26.322014293 +0000 UTC m=+160.889644625" observedRunningTime="2026-03-18 18:05:26.587428506 +0000 UTC m=+161.155058848" watchObservedRunningTime="2026-03-18 18:05:26.588620766 +0000 UTC m=+161.156251098"
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.573634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerStarted","Data":"b93b54fca0ed3980ad98b856481a144cf02aeb1139dc101165a7406441a567ab"}
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.575320 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerStarted","Data":"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"}
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.576893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerStarted","Data":"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"}
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.578959 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerStarted","Data":"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e"}
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.595251 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sk6wx" podStartSLOduration=3.009612385 podStartE2EDuration="50.595237635s" podCreationTimestamp="2026-03-18 18:04:37 +0000 UTC" firstStartedPulling="2026-03-18 18:04:38.958248493 +0000 UTC m=+113.525878815" lastFinishedPulling="2026-03-18 18:05:26.543873733 +0000 UTC m=+161.111504065" observedRunningTime="2026-03-18 18:05:27.591519622 +0000 UTC m=+162.159149954" watchObservedRunningTime="2026-03-18 18:05:27.595237635 +0000 UTC m=+162.162867967"
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.612478 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-czlcm" podStartSLOduration=4.002517913 podStartE2EDuration="52.612459216s" podCreationTimestamp="2026-03-18 18:04:35 +0000 UTC" firstStartedPulling="2026-03-18 18:04:37.819982715 +0000 UTC m=+112.387613047" lastFinishedPulling="2026-03-18 18:05:26.429924018 +0000 UTC m=+160.997554350" observedRunningTime="2026-03-18 18:05:27.61168914 +0000 UTC m=+162.179319472" watchObservedRunningTime="2026-03-18 18:05:27.612459216 +0000 UTC m=+162.180089548"
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.628624 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szdp2" podStartSLOduration=3.608059767 podStartE2EDuration="53.628609841s" podCreationTimestamp="2026-03-18 18:04:34 +0000 UTC" firstStartedPulling="2026-03-18 18:04:36.62326326 +0000 UTC m=+111.190893592" lastFinishedPulling="2026-03-18 18:05:26.643813334 +0000 UTC m=+161.211443666" observedRunningTime="2026-03-18 18:05:27.627412601 +0000 UTC m=+162.195042933" watchObservedRunningTime="2026-03-18 18:05:27.628609841 +0000 UTC m=+162.196240173"
Mar 18 18:05:27 crc kubenswrapper[4830]: I0318 18:05:27.650683 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2tsg6" podStartSLOduration=2.824688172 podStartE2EDuration="52.650665081s" podCreationTimestamp="2026-03-18 18:04:35 +0000 UTC" firstStartedPulling="2026-03-18 18:04:36.53515131 +0000 UTC m=+111.102781642" lastFinishedPulling="2026-03-18 18:05:26.361128209 +0000 UTC m=+160.928758551" observedRunningTime="2026-03-18 18:05:27.645269002 +0000 UTC m=+162.212899334" watchObservedRunningTime="2026-03-18 18:05:27.650665081 +0000 UTC m=+162.218295413"
Mar 18 18:05:28 crc kubenswrapper[4830]: I0318 18:05:28.185283 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sk6wx"
Mar 18 18:05:28 crc kubenswrapper[4830]: I0318 18:05:28.185356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sk6wx"
Mar 18 18:05:28 crc kubenswrapper[4830]: I0318 18:05:28.479885 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5rhcb"
Mar 18 18:05:28 crc kubenswrapper[4830]: I0318 18:05:28.479929 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5rhcb"
Mar 18 18:05:29 crc kubenswrapper[4830]: I0318 18:05:29.091494 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:29 crc kubenswrapper[4830]: I0318 18:05:29.144435 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:29 crc kubenswrapper[4830]: I0318 18:05:29.234458 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sk6wx" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:05:29 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:05:29 crc kubenswrapper[4830]: >
Mar 18 18:05:29 crc kubenswrapper[4830]: I0318 18:05:29.514737 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5rhcb" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:05:29 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:05:29 crc kubenswrapper[4830]: >
Mar 18 18:05:31 crc kubenswrapper[4830]: I0318 18:05:31.580153 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"]
Mar 18 18:05:31 crc kubenswrapper[4830]: I0318 18:05:31.580929 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qfcr" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="registry-server" containerID="cri-o://7a23acdb3d6685042b800a8661791721484afadb1a264dc58175449e05fcb860" gracePeriod=2
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.617585 4830 generic.go:334] "Generic (PLEG): container finished" podID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerID="7a23acdb3d6685042b800a8661791721484afadb1a264dc58175449e05fcb860" exitCode=0
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.617692 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerDied","Data":"7a23acdb3d6685042b800a8661791721484afadb1a264dc58175449e05fcb860"}
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.750156 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.784912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities\") pod \"f62cb37f-382a-4e46-adf7-a26bac073bbe\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") "
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.785005 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dbxr\" (UniqueName: \"kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr\") pod \"f62cb37f-382a-4e46-adf7-a26bac073bbe\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") "
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.785059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content\") pod \"f62cb37f-382a-4e46-adf7-a26bac073bbe\" (UID: \"f62cb37f-382a-4e46-adf7-a26bac073bbe\") "
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.786676 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities" (OuterVolumeSpecName: "utilities") pod "f62cb37f-382a-4e46-adf7-a26bac073bbe" (UID: "f62cb37f-382a-4e46-adf7-a26bac073bbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.791092 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr" (OuterVolumeSpecName: "kube-api-access-2dbxr") pod "f62cb37f-382a-4e46-adf7-a26bac073bbe" (UID: "f62cb37f-382a-4e46-adf7-a26bac073bbe"). InnerVolumeSpecName "kube-api-access-2dbxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.886938 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.887019 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dbxr\" (UniqueName: \"kubernetes.io/projected/f62cb37f-382a-4e46-adf7-a26bac073bbe-kube-api-access-2dbxr\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.942034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f62cb37f-382a-4e46-adf7-a26bac073bbe" (UID: "f62cb37f-382a-4e46-adf7-a26bac073bbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:32 crc kubenswrapper[4830]: I0318 18:05:32.988796 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62cb37f-382a-4e46-adf7-a26bac073bbe-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.646569 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qfcr" event={"ID":"f62cb37f-382a-4e46-adf7-a26bac073bbe","Type":"ContainerDied","Data":"1bd37c2658a4378af117a7c45e9906a137b357875491262f937baad633be79f5"}
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.646715 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qfcr"
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.646976 4830 scope.go:117] "RemoveContainer" containerID="7a23acdb3d6685042b800a8661791721484afadb1a264dc58175449e05fcb860"
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.675240 4830 scope.go:117] "RemoveContainer" containerID="7d89b672c8810d0f1d427f94ad1f802170a8e44b939f4438a00f114dbcc1398b"
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.714489 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"]
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.717326 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qfcr"]
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.724011 4830 scope.go:117] "RemoveContainer" containerID="9542d7f933d806aba5e1a3274bf315e3bc7dfac8aa8d5d5b349f291a0e972b9b"
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.814611 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"]
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.814973 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" podUID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" containerName="controller-manager" containerID="cri-o://8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c" gracePeriod=30
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.844053 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"]
Mar 18 18:05:33 crc kubenswrapper[4830]: I0318 18:05:33.844326 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" podUID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" containerName="route-controller-manager" containerID="cri-o://a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f" gracePeriod=30
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.243057 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" path="/var/lib/kubelet/pods/f62cb37f-382a-4e46-adf7-a26bac073bbe/volumes"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.327379 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.406850 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.412879 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvsg9\" (UniqueName: \"kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9\") pod \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.413008 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert\") pod \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.413064 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca\") pod \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.413094 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config\") pod \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\" (UID: \"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.414045 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" (UID: "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.414152 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config" (OuterVolumeSpecName: "config") pod "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" (UID: "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.418945 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" (UID: "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.419193 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9" (OuterVolumeSpecName: "kube-api-access-jvsg9") pod "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" (UID: "4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9"). InnerVolumeSpecName "kube-api-access-jvsg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.513985 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjwng\" (UniqueName: \"kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng\") pod \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514053 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca\") pod \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514115 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles\") pod \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert\") pod \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config\") pod \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\" (UID: \"0c7a577c-7ae8-4a3a-a4e8-51254422ea62\") "
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514401 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514412 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514421 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514430 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvsg9\" (UniqueName: \"kubernetes.io/projected/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9-kube-api-access-jvsg9\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca" (OuterVolumeSpecName: "client-ca") pod "0c7a577c-7ae8-4a3a-a4e8-51254422ea62" (UID: "0c7a577c-7ae8-4a3a-a4e8-51254422ea62"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.514967 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config" (OuterVolumeSpecName: "config") pod "0c7a577c-7ae8-4a3a-a4e8-51254422ea62" (UID: "0c7a577c-7ae8-4a3a-a4e8-51254422ea62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.515134 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0c7a577c-7ae8-4a3a-a4e8-51254422ea62" (UID: "0c7a577c-7ae8-4a3a-a4e8-51254422ea62"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.517325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c7a577c-7ae8-4a3a-a4e8-51254422ea62" (UID: "0c7a577c-7ae8-4a3a-a4e8-51254422ea62"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.517629 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng" (OuterVolumeSpecName: "kube-api-access-jjwng") pod "0c7a577c-7ae8-4a3a-a4e8-51254422ea62" (UID: "0c7a577c-7ae8-4a3a-a4e8-51254422ea62"). InnerVolumeSpecName "kube-api-access-jjwng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.616197 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.616257 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjwng\" (UniqueName: \"kubernetes.io/projected/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-kube-api-access-jjwng\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.616285 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.616302 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.616316 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7a577c-7ae8-4a3a-a4e8-51254422ea62-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.657748 4830 generic.go:334] "Generic (PLEG): container finished" podID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" containerID="a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f" exitCode=0
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.657842 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" event={"ID":"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9","Type":"ContainerDied","Data":"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"}
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.657911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42" event={"ID":"4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9","Type":"ContainerDied","Data":"f1ebdc510b2cb94e958bab1d36ca3093b31108ce8b4c8f9faf9dc6d126eb62c1"}
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.657913 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.657957 4830 scope.go:117] "RemoveContainer" containerID="a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.661761 4830 generic.go:334] "Generic (PLEG): container finished" podID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" containerID="8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c" exitCode=0
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.661835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" event={"ID":"0c7a577c-7ae8-4a3a-a4e8-51254422ea62","Type":"ContainerDied","Data":"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"}
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.661899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc" event={"ID":"0c7a577c-7ae8-4a3a-a4e8-51254422ea62","Type":"ContainerDied","Data":"3b16d53b13e8401a45e640fbbae97e35d198798b05f7e9b8e11490e0d7bfdace"}
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.661931 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.695019 4830 scope.go:117] "RemoveContainer" containerID="a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"
Mar 18 18:05:34 crc kubenswrapper[4830]: E0318 18:05:34.695509 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f\": container with ID starting with a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f not found: ID does not exist" containerID="a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.695547 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f"} err="failed to get container status \"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f\": rpc error: code = NotFound desc = could not find container \"a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f\": container with ID starting with a2f0d34e170b25ff2736201c160a534dc683d0c93491b9239191095cfaaf483f not found: ID does not exist"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.695572 4830 scope.go:117] "RemoveContainer" containerID="8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.696175 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"]
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.698368 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77ddc5bfb6-95ffc"]
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.734667 4830 scope.go:117] "RemoveContainer" containerID="8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"
Mar 18 18:05:34 crc kubenswrapper[4830]: E0318 18:05:34.735179 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c\": container with ID starting with 8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c not found: ID does not exist" containerID="8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.735222 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c"} err="failed to get container status \"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c\": rpc error: code = NotFound desc = could not find container \"8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c\": container with ID starting with 8051b2a434e479f52ca59a771bd373efd99baa656a1cbf9b5985001871b42b2c not found: ID does not exist"
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.737712 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"]
Mar 18 18:05:34 crc kubenswrapper[4830]: I0318 18:05:34.744363 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c479d4cd-w8c42"]
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.017603 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"]
Mar 18 18:05:35 crc kubenswrapper[4830]: E0318 18:05:35.018112 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="extract-content"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018143 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="extract-content"
Mar 18 18:05:35 crc kubenswrapper[4830]: E0318 18:05:35.018175 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" containerName="controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018192 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" containerName="controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: E0318 18:05:35.018221 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="registry-server"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018237 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="registry-server"
Mar 18 18:05:35 crc kubenswrapper[4830]: E0318 18:05:35.018282 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="extract-utilities"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018298 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="extract-utilities"
Mar 18 18:05:35 crc kubenswrapper[4830]: E0318 18:05:35.018318 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" containerName="route-controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018334 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" containerName="route-controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018554 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" containerName="route-controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018578 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62cb37f-382a-4e46-adf7-a26bac073bbe" containerName="registry-server"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.018596 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" containerName="controller-manager"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.019268 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.021602 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"]
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.022491 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.022615 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.023612 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.023860 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.024014 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.024199 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 18:05:35 crc kubenswrapper[4830]: I0318
18:05:35.025049 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.025225 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.025332 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.025385 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.025819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.025989 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.029631 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.035209 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.043637 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"] Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.053197 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"] Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123200 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123324 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123349 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fm8\" (UniqueName: \"kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") 
" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123434 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7rm\" (UniqueName: \"kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123502 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.123536 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225089 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225569 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fm8\" (UniqueName: \"kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7rm\" (UniqueName: \"kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc 
kubenswrapper[4830]: I0318 18:05:35.225832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225885 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.225953 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.226941 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: 
\"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.227249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.227832 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.228428 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.228692 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.233263 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.237039 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.248370 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fm8\" (UniqueName: \"kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8\") pod \"route-controller-manager-8795dc848-jtshw\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") " pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.253145 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7rm\" (UniqueName: \"kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm\") pod \"controller-manager-58657d8f8c-gvpnx\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") " pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.369590 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.388185 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.517582 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.518907 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.573615 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.646125 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.647168 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.657816 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"] Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.691498 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"] Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.701519 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.702469 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.704693 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.747459 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2tsg6" Mar 18 18:05:35 crc kubenswrapper[4830]: I0318 18:05:35.756236 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.011836 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-czlcm" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.012422 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-czlcm" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.058232 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-czlcm" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.242696 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7a577c-7ae8-4a3a-a4e8-51254422ea62" path="/var/lib/kubelet/pods/0c7a577c-7ae8-4a3a-a4e8-51254422ea62/volumes" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.243665 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9" path="/var/lib/kubelet/pods/4c618f7c-fe7d-4eb1-b78a-937c9d28e3e9/volumes" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.681733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" event={"ID":"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d","Type":"ContainerStarted","Data":"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"} Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.681789 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" event={"ID":"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d","Type":"ContainerStarted","Data":"405132e2612c24140a4ecf0d1da8b5164e14d002078dc588284926f6503e35fb"} Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.682156 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.684047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" event={"ID":"b4966489-d69c-4915-bc0d-3337a7d5067e","Type":"ContainerStarted","Data":"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"} Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.684074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" event={"ID":"b4966489-d69c-4915-bc0d-3337a7d5067e","Type":"ContainerStarted","Data":"a8213183496dfba63f5413296ae789f1a43fcc7904109079a4814f4415c5b0e6"} Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.691032 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.703291 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" podStartSLOduration=3.703266007 podStartE2EDuration="3.703266007s" podCreationTimestamp="2026-03-18 18:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:36.702139889 +0000 UTC m=+171.269770251" watchObservedRunningTime="2026-03-18 18:05:36.703266007 +0000 UTC m=+171.270896349" Mar 18 18:05:36 crc kubenswrapper[4830]: 
I0318 18:05:36.739121 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-czlcm" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.748897 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szdp2" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.750498 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" podStartSLOduration=3.75046227 podStartE2EDuration="3.75046227s" podCreationTimestamp="2026-03-18 18:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:36.749214949 +0000 UTC m=+171.316845291" watchObservedRunningTime="2026-03-18 18:05:36.75046227 +0000 UTC m=+171.318092602" Mar 18 18:05:36 crc kubenswrapper[4830]: I0318 18:05:36.764091 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-klcdh" Mar 18 18:05:37 crc kubenswrapper[4830]: I0318 18:05:37.693679 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:37 crc kubenswrapper[4830]: I0318 18:05:37.703189 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" Mar 18 18:05:37 crc kubenswrapper[4830]: I0318 18:05:37.791734 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klcdh"] Mar 18 18:05:37 crc kubenswrapper[4830]: I0318 18:05:37.987452 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czlcm"] Mar 18 18:05:38 crc kubenswrapper[4830]: I0318 18:05:38.262308 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:05:38 crc kubenswrapper[4830]: I0318 18:05:38.325165 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:05:38 crc kubenswrapper[4830]: I0318 18:05:38.530439 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:05:38 crc kubenswrapper[4830]: I0318 18:05:38.588501 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5rhcb" Mar 18 18:05:38 crc kubenswrapper[4830]: I0318 18:05:38.696292 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-czlcm" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="registry-server" containerID="cri-o://472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c" gracePeriod=2 Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.245996 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czlcm" Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.325041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities\") pod \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.325228 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content\") pod \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.325810 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities" (OuterVolumeSpecName: "utilities") pod "3fa33f19-15c5-4f78-88ef-1db6eb605aa7" (UID: "3fa33f19-15c5-4f78-88ef-1db6eb605aa7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.330248 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5q2g\" (UniqueName: \"kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g\") pod \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\" (UID: \"3fa33f19-15c5-4f78-88ef-1db6eb605aa7\") " Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.331380 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.342065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g" (OuterVolumeSpecName: "kube-api-access-g5q2g") pod "3fa33f19-15c5-4f78-88ef-1db6eb605aa7" (UID: "3fa33f19-15c5-4f78-88ef-1db6eb605aa7"). InnerVolumeSpecName "kube-api-access-g5q2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.396048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fa33f19-15c5-4f78-88ef-1db6eb605aa7" (UID: "3fa33f19-15c5-4f78-88ef-1db6eb605aa7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.432934 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.432978 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5q2g\" (UniqueName: \"kubernetes.io/projected/3fa33f19-15c5-4f78-88ef-1db6eb605aa7-kube-api-access-g5q2g\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.707679 4830 generic.go:334] "Generic (PLEG): container finished" podID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerID="472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c" exitCode=0
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.707758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerDied","Data":"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"}
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.707855 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czlcm" event={"ID":"3fa33f19-15c5-4f78-88ef-1db6eb605aa7","Type":"ContainerDied","Data":"06c5295f74367633149d70f9a50caf740d43a319753a7049456b9f2f0819aedf"}
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.707862 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-czlcm"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.707891 4830 scope.go:117] "RemoveContainer" containerID="472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.708236 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-klcdh" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="registry-server" containerID="cri-o://a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2" gracePeriod=2
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.741755 4830 scope.go:117] "RemoveContainer" containerID="3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.775409 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czlcm"]
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.786668 4830 scope.go:117] "RemoveContainer" containerID="4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.788996 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-czlcm"]
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.916624 4830 scope.go:117] "RemoveContainer" containerID="472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"
Mar 18 18:05:39 crc kubenswrapper[4830]: E0318 18:05:39.917669 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c\": container with ID starting with 472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c not found: ID does not exist" containerID="472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.917731 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c"} err="failed to get container status \"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c\": rpc error: code = NotFound desc = could not find container \"472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c\": container with ID starting with 472449b11f809f3c0b26735711d802627f73b1b92edb175aa677101e4dae465c not found: ID does not exist"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.917793 4830 scope.go:117] "RemoveContainer" containerID="3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc"
Mar 18 18:05:39 crc kubenswrapper[4830]: E0318 18:05:39.918893 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc\": container with ID starting with 3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc not found: ID does not exist" containerID="3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.918949 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc"} err="failed to get container status \"3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc\": rpc error: code = NotFound desc = could not find container \"3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc\": container with ID starting with 3522fef8c822fb2a5ab4ea87689bcc0589d010c03a8df7edc75b78e5c15021bc not found: ID does not exist"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.918985 4830 scope.go:117] "RemoveContainer" containerID="4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8"
Mar 18 18:05:39 crc kubenswrapper[4830]: E0318 18:05:39.920559 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8\": container with ID starting with 4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8 not found: ID does not exist" containerID="4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8"
Mar 18 18:05:39 crc kubenswrapper[4830]: I0318 18:05:39.920620 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8"} err="failed to get container status \"4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8\": rpc error: code = NotFound desc = could not find container \"4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8\": container with ID starting with 4177dc22a635c9ef897f3230b387441b7b7fe41e4761235bbbfd694d82338cc8 not found: ID does not exist"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.178319 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"]
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.178598 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sk6wx" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="registry-server" containerID="cri-o://b93b54fca0ed3980ad98b856481a144cf02aeb1139dc101165a7406441a567ab" gracePeriod=2
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.214711 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" containerID="cri-o://45ac4097ed87ea50bce298a8c5c2aac3970b76fc10ec18ef9adc87fa65809345" gracePeriod=15
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.242269 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" path="/var/lib/kubelet/pods/3fa33f19-15c5-4f78-88ef-1db6eb605aa7/volumes"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.365820 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.447467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities\") pod \"b1176643-c4d6-4be9-8317-a99886a32b29\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.447526 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content\") pod \"b1176643-c4d6-4be9-8317-a99886a32b29\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.447574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzb79\" (UniqueName: \"kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79\") pod \"b1176643-c4d6-4be9-8317-a99886a32b29\" (UID: \"b1176643-c4d6-4be9-8317-a99886a32b29\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.448401 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities" (OuterVolumeSpecName: "utilities") pod "b1176643-c4d6-4be9-8317-a99886a32b29" (UID: "b1176643-c4d6-4be9-8317-a99886a32b29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.451955 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79" (OuterVolumeSpecName: "kube-api-access-lzb79") pod "b1176643-c4d6-4be9-8317-a99886a32b29" (UID: "b1176643-c4d6-4be9-8317-a99886a32b29"). InnerVolumeSpecName "kube-api-access-lzb79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.523605 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1176643-c4d6-4be9-8317-a99886a32b29" (UID: "b1176643-c4d6-4be9-8317-a99886a32b29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.548745 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.548798 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzb79\" (UniqueName: \"kubernetes.io/projected/b1176643-c4d6-4be9-8317-a99886a32b29-kube-api-access-lzb79\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.548814 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1176643-c4d6-4be9-8317-a99886a32b29-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.715845 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1176643-c4d6-4be9-8317-a99886a32b29" containerID="a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2" exitCode=0
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.715930 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klcdh"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.715937 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerDied","Data":"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"}
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.716160 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klcdh" event={"ID":"b1176643-c4d6-4be9-8317-a99886a32b29","Type":"ContainerDied","Data":"e7c7d3b3bab45bacb82d6a5a86d7bfa957c42b029cc974cd5e089fab79087c8d"}
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.716241 4830 scope.go:117] "RemoveContainer" containerID="a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.719580 4830 generic.go:334] "Generic (PLEG): container finished" podID="84a21e6e-7bda-408b-a607-f02b4f807535" containerID="45ac4097ed87ea50bce298a8c5c2aac3970b76fc10ec18ef9adc87fa65809345" exitCode=0
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.719651 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" event={"ID":"84a21e6e-7bda-408b-a607-f02b4f807535","Type":"ContainerDied","Data":"45ac4097ed87ea50bce298a8c5c2aac3970b76fc10ec18ef9adc87fa65809345"}
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.725618 4830 generic.go:334] "Generic (PLEG): container finished" podID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerID="b93b54fca0ed3980ad98b856481a144cf02aeb1139dc101165a7406441a567ab" exitCode=0
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.725666 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerDied","Data":"b93b54fca0ed3980ad98b856481a144cf02aeb1139dc101165a7406441a567ab"}
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.747658 4830 scope.go:117] "RemoveContainer" containerID="ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.761899 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klcdh"]
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.767399 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-klcdh"]
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.770457 4830 scope.go:117] "RemoveContainer" containerID="392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.785750 4830 scope.go:117] "RemoveContainer" containerID="a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"
Mar 18 18:05:40 crc kubenswrapper[4830]: E0318 18:05:40.786168 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2\": container with ID starting with a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2 not found: ID does not exist" containerID="a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.786218 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2"} err="failed to get container status \"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2\": rpc error: code = NotFound desc = could not find container \"a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2\": container with ID starting with a3aab7c607c8ae4ca0201cb2e060b65ce929a8b367ebf1f54767b1a2d39a9ad2 not found: ID does not exist"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.786252 4830 scope.go:117] "RemoveContainer" containerID="ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270"
Mar 18 18:05:40 crc kubenswrapper[4830]: E0318 18:05:40.786601 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270\": container with ID starting with ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270 not found: ID does not exist" containerID="ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.786633 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270"} err="failed to get container status \"ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270\": rpc error: code = NotFound desc = could not find container \"ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270\": container with ID starting with ae7c46db00c8bd306e66778097e977b03646392cf9954feb93951dd4e6017270 not found: ID does not exist"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.786656 4830 scope.go:117] "RemoveContainer" containerID="392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c"
Mar 18 18:05:40 crc kubenswrapper[4830]: E0318 18:05:40.786882 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c\": container with ID starting with 392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c not found: ID does not exist" containerID="392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.786905 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c"} err="failed to get container status \"392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c\": rpc error: code = NotFound desc = could not find container \"392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c\": container with ID starting with 392fafb4baa602a9290fd49e7c1bdec5921dd5581853104fc1152964d7266c1c not found: ID does not exist"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.817256 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk6wx"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.862219 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm"
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956370 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content\") pod \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956436 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956477 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkzx6\" (UniqueName: \"kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6\") pod \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956499 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956523 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956575 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956598 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khtw6\" (UniqueName: \"kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956664 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities\") pod \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\" (UID: \"dbfe6d63-05ee-40d2-affa-03b9310a27c1\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956681 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956727 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956745 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956785 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956803 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.956846 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data\") pod \"84a21e6e-7bda-408b-a607-f02b4f807535\" (UID: \"84a21e6e-7bda-408b-a607-f02b4f807535\") "
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.957666 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.957671 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.957742 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.958246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.958747 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities" (OuterVolumeSpecName: "utilities") pod "dbfe6d63-05ee-40d2-affa-03b9310a27c1" (UID: "dbfe6d63-05ee-40d2-affa-03b9310a27c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.959555 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.960887 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.961351 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6" (OuterVolumeSpecName: "kube-api-access-dkzx6") pod "dbfe6d63-05ee-40d2-affa-03b9310a27c1" (UID: "dbfe6d63-05ee-40d2-affa-03b9310a27c1"). InnerVolumeSpecName "kube-api-access-dkzx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.962331 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.965390 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6" (OuterVolumeSpecName: "kube-api-access-khtw6") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "kube-api-access-khtw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.965453 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.966103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.966257 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.966584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.967049 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.967254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "84a21e6e-7bda-408b-a607-f02b4f807535" (UID: "84a21e6e-7bda-408b-a607-f02b4f807535"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:40 crc kubenswrapper[4830]: I0318 18:05:40.980325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbfe6d63-05ee-40d2-affa-03b9310a27c1" (UID: "dbfe6d63-05ee-40d2-affa-03b9310a27c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058322 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058420 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058507 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058529 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058586 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058606 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkzx6\" (UniqueName: \"kubernetes.io/projected/dbfe6d63-05ee-40d2-affa-03b9310a27c1-kube-api-access-dkzx6\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058628 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058686 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058710 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058728 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058827 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khtw6\" (UniqueName: \"kubernetes.io/projected/84a21e6e-7bda-408b-a607-f02b4f807535-kube-api-access-khtw6\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058849 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058868 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe6d63-05ee-40d2-affa-03b9310a27c1-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058924 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84a21e6e-7bda-408b-a607-f02b4f807535-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058941 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.058996 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.059018 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84a21e6e-7bda-408b-a607-f02b4f807535-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.733151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm" event={"ID":"84a21e6e-7bda-408b-a607-f02b4f807535","Type":"ContainerDied","Data":"377506eab5a17193ab2406749c4ee6f41d9333773dd66510a3514ec88fb22891"}
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.733174 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kczvm"
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.733202 4830 scope.go:117] "RemoveContainer" containerID="45ac4097ed87ea50bce298a8c5c2aac3970b76fc10ec18ef9adc87fa65809345"
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.737940 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk6wx" event={"ID":"dbfe6d63-05ee-40d2-affa-03b9310a27c1","Type":"ContainerDied","Data":"641429fab93568eccc5393904a74582a2edc0a5605340276dc967a94b469507e"}
Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.738084 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk6wx" Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.760264 4830 scope.go:117] "RemoveContainer" containerID="b93b54fca0ed3980ad98b856481a144cf02aeb1139dc101165a7406441a567ab" Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.775697 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"] Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.778699 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kczvm"] Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.790874 4830 scope.go:117] "RemoveContainer" containerID="4b99ac56390f23695d876f72270af61627a83473bda5ee2d9c5392fcc6871595" Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.791084 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"] Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.795113 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk6wx"] Mar 18 18:05:41 crc kubenswrapper[4830]: I0318 18:05:41.823632 4830 scope.go:117] "RemoveContainer" containerID="6ce5799dd83d5007902c4d4afc49d4759d15f0e4e4e54f35ea8e19b46a37d987" Mar 18 18:05:42 crc kubenswrapper[4830]: I0318 18:05:42.246100 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" path="/var/lib/kubelet/pods/84a21e6e-7bda-408b-a607-f02b4f807535/volumes" Mar 18 18:05:42 crc kubenswrapper[4830]: I0318 18:05:42.246818 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" path="/var/lib/kubelet/pods/b1176643-c4d6-4be9-8317-a99886a32b29/volumes" Mar 18 18:05:42 crc kubenswrapper[4830]: I0318 18:05:42.247615 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" 
path="/var/lib/kubelet/pods/dbfe6d63-05ee-40d2-affa-03b9310a27c1/volumes" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.019820 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-r8vkm"] Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020459 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020479 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020498 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020511 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020531 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020546 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020566 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020579 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020597 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020609 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020629 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020641 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="extract-content" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020655 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020666 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="extract-utilities" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020683 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020694 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020724 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: E0318 18:05:45.020745 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020757 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020964 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa33f19-15c5-4f78-88ef-1db6eb605aa7" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.020982 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfe6d63-05ee-40d2-affa-03b9310a27c1" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.021000 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a21e6e-7bda-408b-a607-f02b4f807535" containerName="oauth-openshift" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.021020 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1176643-c4d6-4be9-8317-a99886a32b29" containerName="registry-server" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.021577 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.026108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.026352 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.026363 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.026640 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.026722 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.027009 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.027706 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.030893 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.031508 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.036272 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 18:05:45 crc 
kubenswrapper[4830]: I0318 18:05:45.037759 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.038664 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.045566 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.047951 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.051217 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.051936 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-r8vkm"] Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-policies\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113189 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " 
pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113238 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 
18:05:45.113505 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dm8\" (UniqueName: \"kubernetes.io/projected/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-kube-api-access-z7dm8\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113699 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113789 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-dir\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.113996 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.114100 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.114177 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " 
pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215138 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dm8\" (UniqueName: \"kubernetes.io/projected/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-kube-api-access-z7dm8\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215192 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215245 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215278 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215346 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-dir\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215646 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " 
pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215767 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-policies\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215859 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.215937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.216277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.216088 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-dir\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.217504 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.217656 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-service-ca\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.217719 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.218272 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-audit-policies\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" Mar 18 18:05:45 crc 
kubenswrapper[4830]: I0318 18:05:45.225045 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.225176 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.225373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-router-certs\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.226667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-error\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.226995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-session\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.227490 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-template-login\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.228382 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.229751 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.238552 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dm8\" (UniqueName: \"kubernetes.io/projected/0e0ede53-4b1e-46e1-8db0-380277ad4ed7-kube-api-access-z7dm8\") pod \"oauth-openshift-85766c7959-r8vkm\" (UID: \"0e0ede53-4b1e-46e1-8db0-380277ad4ed7\") " pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.388967 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:45 crc kubenswrapper[4830]: I0318 18:05:45.811662 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85766c7959-r8vkm"]
Mar 18 18:05:45 crc kubenswrapper[4830]: W0318 18:05:45.816551 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0ede53_4b1e_46e1_8db0_380277ad4ed7.slice/crio-3091dbfe9e863892db7a9e1dc745ef5e175ec4bac801bfa91a4959deae111068 WatchSource:0}: Error finding container 3091dbfe9e863892db7a9e1dc745ef5e175ec4bac801bfa91a4959deae111068: Status 404 returned error can't find the container with id 3091dbfe9e863892db7a9e1dc745ef5e175ec4bac801bfa91a4959deae111068
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.513155 4830 ???:1] "http: TLS handshake error from 192.168.126.11:43792: no serving certificate available for the kubelet"
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.788101 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" event={"ID":"0e0ede53-4b1e-46e1-8db0-380277ad4ed7","Type":"ContainerStarted","Data":"979db43dadbe82f1b11929cf0eec3ab7f8ca524ebbdc8dddbee7b3a9f373053e"}
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.788153 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" event={"ID":"0e0ede53-4b1e-46e1-8db0-380277ad4ed7","Type":"ContainerStarted","Data":"3091dbfe9e863892db7a9e1dc745ef5e175ec4bac801bfa91a4959deae111068"}
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.788475 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.793960 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm"
Mar 18 18:05:46 crc kubenswrapper[4830]: I0318 18:05:46.807813 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85766c7959-r8vkm" podStartSLOduration=31.807772681 podStartE2EDuration="31.807772681s" podCreationTimestamp="2026-03-18 18:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:46.806121357 +0000 UTC m=+181.373751689" watchObservedRunningTime="2026-03-18 18:05:46.807772681 +0000 UTC m=+181.375403023"
Mar 18 18:05:53 crc kubenswrapper[4830]: I0318 18:05:53.785821 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"]
Mar 18 18:05:53 crc kubenswrapper[4830]: I0318 18:05:53.788551 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" podUID="b4966489-d69c-4915-bc0d-3337a7d5067e" containerName="controller-manager" containerID="cri-o://8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c" gracePeriod=30
Mar 18 18:05:53 crc kubenswrapper[4830]: I0318 18:05:53.881877 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"]
Mar 18 18:05:53 crc kubenswrapper[4830]: I0318 18:05:53.882532 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" podUID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" containerName="route-controller-manager" containerID="cri-o://e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f" gracePeriod=30
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.364277 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.372151 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.452552 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca\") pod \"b4966489-d69c-4915-bc0d-3337a7d5067e\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.452607 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert\") pod \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.452656 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5fm8\" (UniqueName: \"kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8\") pod \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.452685 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles\") pod \"b4966489-d69c-4915-bc0d-3337a7d5067e\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert\") pod \"b4966489-d69c-4915-bc0d-3337a7d5067e\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453721 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca\") pod \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453741 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config\") pod \"b4966489-d69c-4915-bc0d-3337a7d5067e\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453535 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4966489-d69c-4915-bc0d-3337a7d5067e" (UID: "b4966489-d69c-4915-bc0d-3337a7d5067e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453545 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b4966489-d69c-4915-bc0d-3337a7d5067e" (UID: "b4966489-d69c-4915-bc0d-3337a7d5067e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.453765 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql7rm\" (UniqueName: \"kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm\") pod \"b4966489-d69c-4915-bc0d-3337a7d5067e\" (UID: \"b4966489-d69c-4915-bc0d-3337a7d5067e\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454042 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config\") pod \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\" (UID: \"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d\") "
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454290 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" (UID: "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454426 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454439 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454450 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454590 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config" (OuterVolumeSpecName: "config") pod "b4966489-d69c-4915-bc0d-3337a7d5067e" (UID: "b4966489-d69c-4915-bc0d-3337a7d5067e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.454618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config" (OuterVolumeSpecName: "config") pod "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" (UID: "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.459693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4966489-d69c-4915-bc0d-3337a7d5067e" (UID: "b4966489-d69c-4915-bc0d-3337a7d5067e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.472195 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" (UID: "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.474049 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8" (OuterVolumeSpecName: "kube-api-access-n5fm8") pod "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" (UID: "0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d"). InnerVolumeSpecName "kube-api-access-n5fm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.476804 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm" (OuterVolumeSpecName: "kube-api-access-ql7rm") pod "b4966489-d69c-4915-bc0d-3337a7d5067e" (UID: "b4966489-d69c-4915-bc0d-3337a7d5067e"). InnerVolumeSpecName "kube-api-access-ql7rm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555795 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4966489-d69c-4915-bc0d-3337a7d5067e-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555834 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7rm\" (UniqueName: \"kubernetes.io/projected/b4966489-d69c-4915-bc0d-3337a7d5067e-kube-api-access-ql7rm\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555849 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555862 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555875 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5fm8\" (UniqueName: \"kubernetes.io/projected/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d-kube-api-access-n5fm8\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.555888 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4966489-d69c-4915-bc0d-3337a7d5067e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.844064 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" containerID="e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f" exitCode=0
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.844119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" event={"ID":"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d","Type":"ContainerDied","Data":"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"}
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.844579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw" event={"ID":"0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d","Type":"ContainerDied","Data":"405132e2612c24140a4ecf0d1da8b5164e14d002078dc588284926f6503e35fb"}
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.844188 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.844623 4830 scope.go:117] "RemoveContainer" containerID="e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.851265 4830 generic.go:334] "Generic (PLEG): container finished" podID="b4966489-d69c-4915-bc0d-3337a7d5067e" containerID="8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c" exitCode=0
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.851358 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.851362 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" event={"ID":"b4966489-d69c-4915-bc0d-3337a7d5067e","Type":"ContainerDied","Data":"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"}
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.851446 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58657d8f8c-gvpnx" event={"ID":"b4966489-d69c-4915-bc0d-3337a7d5067e","Type":"ContainerDied","Data":"a8213183496dfba63f5413296ae789f1a43fcc7904109079a4814f4415c5b0e6"}
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.868201 4830 scope.go:117] "RemoveContainer" containerID="e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"
Mar 18 18:05:54 crc kubenswrapper[4830]: E0318 18:05:54.869642 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f\": container with ID starting with e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f not found: ID does not exist" containerID="e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.869690 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f"} err="failed to get container status \"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f\": rpc error: code = NotFound desc = could not find container \"e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f\": container with ID starting with e78ae3fb160cd1b0c1530b9939af98b79dea3f4eeeac9110794dc7e92cd4a75f not found: ID does not exist"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.869718 4830 scope.go:117] "RemoveContainer" containerID="8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.882752 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"]
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.888231 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8795dc848-jtshw"]
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.905652 4830 scope.go:117] "RemoveContainer" containerID="8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"
Mar 18 18:05:54 crc kubenswrapper[4830]: E0318 18:05:54.906279 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c\": container with ID starting with 8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c not found: ID does not exist" containerID="8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.906328 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c"} err="failed to get container status \"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c\": rpc error: code = NotFound desc = could not find container \"8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c\": container with ID starting with 8640141c2dca319567fbc2b63f5dfe6d2c29ac4c3c481bb7e31003a39730754c not found: ID does not exist"
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.907579 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"]
Mar 18 18:05:54 crc kubenswrapper[4830]: I0318 18:05:54.911344 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58657d8f8c-gvpnx"]
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.029457 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"]
Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.030607 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" containerName="route-controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.030634 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" containerName="route-controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.030698 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4966489-d69c-4915-bc0d-3337a7d5067e" containerName="controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.030708 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4966489-d69c-4915-bc0d-3337a7d5067e" containerName="controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.031078 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4966489-d69c-4915-bc0d-3337a7d5067e" containerName="controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.031112 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" containerName="route-controller-manager"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.031845 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.032013 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"]
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.034620 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.034731 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.034842 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.035159 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.035558 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.037881 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.042603 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"]
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.042740 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.043056 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"]
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.078795 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.078990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.080106 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.080231 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.080298 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.082495 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.102947 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162198 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95ms\" (UniqueName: \"kubernetes.io/projected/30e39130-c686-4adb-abf1-854a478ec384-kube-api-access-s95ms\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-config\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162293 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-config\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-client-ca\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vck9b\" (UniqueName: \"kubernetes.io/projected/4ecdc8c4-09c4-47a3-a705-dd2bce091671-kube-api-access-vck9b\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecdc8c4-09c4-47a3-a705-dd2bce091671-serving-cert\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30e39130-c686-4adb-abf1-854a478ec384-serving-cert\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-client-ca\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.162888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-proxy-ca-bundles\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264548 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-proxy-ca-bundles\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95ms\" (UniqueName: \"kubernetes.io/projected/30e39130-c686-4adb-abf1-854a478ec384-kube-api-access-s95ms\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264702 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-config\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264735 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-config\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264839 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-client-ca\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vck9b\" (UniqueName: \"kubernetes.io/projected/4ecdc8c4-09c4-47a3-a705-dd2bce091671-kube-api-access-vck9b\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264918 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecdc8c4-09c4-47a3-a705-dd2bce091671-serving-cert\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264962 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30e39130-c686-4adb-abf1-854a478ec384-serving-cert\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.264990 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-client-ca\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.266130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-client-ca\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.266461 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30e39130-c686-4adb-abf1-854a478ec384-config\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.266602 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-client-ca\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.266993 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-proxy-ca-bundles\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.267503 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecdc8c4-09c4-47a3-a705-dd2bce091671-config\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.278125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecdc8c4-09c4-47a3-a705-dd2bce091671-serving-cert\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:55 crc
kubenswrapper[4830]: I0318 18:05:55.281809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95ms\" (UniqueName: \"kubernetes.io/projected/30e39130-c686-4adb-abf1-854a478ec384-kube-api-access-s95ms\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.284007 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30e39130-c686-4adb-abf1-854a478ec384-serving-cert\") pod \"route-controller-manager-65f7d746d6-s8skv\" (UID: \"30e39130-c686-4adb-abf1-854a478ec384\") " pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.296502 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vck9b\" (UniqueName: \"kubernetes.io/projected/4ecdc8c4-09c4-47a3-a705-dd2bce091671-kube-api-access-vck9b\") pod \"controller-manager-57d7f4b84d-pk78f\" (UID: \"4ecdc8c4-09c4-47a3-a705-dd2bce091671\") " pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.397982 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.405073 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.627670 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"] Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.860753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" event={"ID":"30e39130-c686-4adb-abf1-854a478ec384","Type":"ContainerStarted","Data":"58d5eae5d118c4880937cc4e62ab281ff916bd120e7a74814223061cb6b7661a"} Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.861231 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" event={"ID":"30e39130-c686-4adb-abf1-854a478ec384","Type":"ContainerStarted","Data":"d584f038acbb2b0ccc4e7bad4de2cfc98828d2672e1c1914bbb1ed6af0d19cd6"} Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.861260 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.883625 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"] Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.886862 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podStartSLOduration=2.886837174 podStartE2EDuration="2.886837174s" podCreationTimestamp="2026-03-18 18:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.878196328 +0000 UTC m=+190.445826670" watchObservedRunningTime="2026-03-18 18:05:55.886837174 +0000 UTC 
m=+190.454467546" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.892767 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893205 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893228 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e" gracePeriod=15 Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893320 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75" gracePeriod=15 Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893287 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f" gracePeriod=15 Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893308 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20" gracePeriod=15 Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.893342 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e" gracePeriod=15 Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.894901 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.894945 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.894964 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.894979 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.894997 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895011 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895046 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895060 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895079 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895092 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895114 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895128 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895147 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895160 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895183 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895196 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895381 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895404 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc 
kubenswrapper[4830]: I0318 18:05:55.895419 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895438 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895452 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895473 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895491 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.895656 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895671 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.895878 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.898194 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.900302 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: W0318 18:05:55.903236 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ecdc8c4_09c4_47a3_a705_dd2bce091671.slice/crio-9ffc4f03b55baed737d2891152523a275f363792ed4f88f2248324a0962210f3 WatchSource:0}: Error finding container 9ffc4f03b55baed737d2891152523a275f363792ed4f88f2248324a0962210f3: Status 404 returned error can't find the container with id 9ffc4f03b55baed737d2891152523a275f363792ed4f88f2248324a0962210f3 Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.915336 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972756 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972819 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972839 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972875 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972895 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972929 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: I0318 18:05:55.972975 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:55 crc kubenswrapper[4830]: E0318 18:05:55.986575 4830 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: E0318 18:05:56.035194 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-57d7f4b84d-pk78f.189e01af63b96d84 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-57d7f4b84d-pk78f,UID:4ecdc8c4-09c4-47a3-a705-dd2bce091671,APIVersion:v1,ResourceVersion:29866,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:05:56.034555268 +0000 UTC m=+190.602185610,LastTimestamp:2026-03-18 18:05:56.034555268 +0000 UTC m=+190.602185610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.073857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.073926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.073979 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074009 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074033 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074148 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074317 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 
18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074348 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074402 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.074457 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.239299 4830 status_manager.go:851] "Failed to get status 
for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.244874 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d" path="/var/lib/kubelet/pods/0b7e93c3-4ed7-4ea6-b3e8-6cc80cc7c05d/volumes" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.245758 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4966489-d69c-4915-bc0d-3337a7d5067e" path="/var/lib/kubelet/pods/b4966489-d69c-4915-bc0d-3337a7d5067e/volumes" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.287760 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.861096 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.861537 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.872480 4830 
generic.go:334] "Generic (PLEG): container finished" podID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" containerID="180be0aef1f3145a96040a06d16cfbdfbf4a972fc09ee5a2b38354e0903a5550" exitCode=0 Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.872564 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5a1d7a8b-7a09-41f3-828b-83e57be4bac7","Type":"ContainerDied","Data":"180be0aef1f3145a96040a06d16cfbdfbf4a972fc09ee5a2b38354e0903a5550"} Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.873536 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.874040 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.877129 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.879308 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.880846 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f" exitCode=0
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.880902 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20" exitCode=0
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.880923 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e" exitCode=0
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.880942 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75" exitCode=2
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.881003 4830 scope.go:117] "RemoveContainer" containerID="811467b40c9890aea83b831baeb4e3799cdbf79ed318366ac0cc6e3a89dbda08"
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.884183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"75d69fbbb950d1a6e04f1412e46caeb6dd8f8e1fe102e6aed90390def08f887c"}
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.884259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8f65922e3d0d625561c8a988ce3b571093439407a116146be929293db34fcb27"}
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.885263 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:56 crc kubenswrapper[4830]: E0318 18:05:56.885457 4830 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.885889 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.887903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" event={"ID":"4ecdc8c4-09c4-47a3-a705-dd2bce091671","Type":"ContainerStarted","Data":"153e0eb5b788e8f84f342993fc01cfeedd19280a3e8f719eb2f59deda7e3845d"}
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.887948 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" event={"ID":"4ecdc8c4-09c4-47a3-a705-dd2bce091671","Type":"ContainerStarted","Data":"9ffc4f03b55baed737d2891152523a275f363792ed4f88f2248324a0962210f3"}
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.888621 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:56 crc kubenswrapper[4830]: I0318 18:05:56.889026 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.888362 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.888908 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.902995 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.905017 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.912179 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.912863 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:57 crc kubenswrapper[4830]: I0318 18:05:57.913213 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.251223 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.252903 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.253403 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.258202 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.259418 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.260133 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.260765 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.261172 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306340 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir\") pod \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306414 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306450 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306487 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a1d7a8b-7a09-41f3-828b-83e57be4bac7" (UID: "5a1d7a8b-7a09-41f3-828b-83e57be4bac7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock\") pod \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306594 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306616 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306670 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306700 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock" (OuterVolumeSpecName: "var-lock") pod "5a1d7a8b-7a09-41f3-828b-83e57be4bac7" (UID: "5a1d7a8b-7a09-41f3-828b-83e57be4bac7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306715 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access\") pod \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\" (UID: \"5a1d7a8b-7a09-41f3-828b-83e57be4bac7\") "
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.306809 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.307418 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.307450 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.307464 4830 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.307476 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-var-lock\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.307488 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.316795 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a1d7a8b-7a09-41f3-828b-83e57be4bac7" (UID: "5a1d7a8b-7a09-41f3-828b-83e57be4bac7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.408939 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a1d7a8b-7a09-41f3-828b-83e57be4bac7-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.918001 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.919938 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e" exitCode=0
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.920106 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.920122 4830 scope.go:117] "RemoveContainer" containerID="4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.924197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5a1d7a8b-7a09-41f3-828b-83e57be4bac7","Type":"ContainerDied","Data":"05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c"}
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.924260 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f025272629d54e1c05bb8c488b3201eb0bbd160f40d78f76f604936181530c"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.924224 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.948132 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.948578 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.949308 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.951425 4830 scope.go:117] "RemoveContainer" containerID="3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.956720 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.957494 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.958021 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.964836 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.966544 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.967196 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.967809 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.968235 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.968276 4830 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 18 18:05:58 crc kubenswrapper[4830]: E0318 18:05:58.968627 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="200ms"
Mar 18 18:05:58 crc kubenswrapper[4830]: I0318 18:05:58.980348 4830 scope.go:117] "RemoveContainer" containerID="7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.005466 4830 scope.go:117] "RemoveContainer" containerID="52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.027549 4830 scope.go:117] "RemoveContainer" containerID="b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.055260 4830 scope.go:117] "RemoveContainer" containerID="4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.095621 4830 scope.go:117] "RemoveContainer" containerID="4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.096282 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\": container with ID starting with 4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f not found: ID does not exist" containerID="4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.096390 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f"} err="failed to get container status \"4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\": rpc error: code = NotFound desc = could not find container \"4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f\": container with ID starting with 4f7dc1c185e795944593de077f9e79ba62fab1373d37016cacb2a7bd48ad096f not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.096434 4830 scope.go:117] "RemoveContainer" containerID="3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.097205 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\": container with ID starting with 3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20 not found: ID does not exist" containerID="3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.097264 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20"} err="failed to get container status \"3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\": rpc error: code = NotFound desc = could not find container \"3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20\": container with ID starting with 3cb94fab85c6cbe1b987435f9965c724430c2674a665bde3e1f284e2a62adb20 not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.097306 4830 scope.go:117] "RemoveContainer" containerID="7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.098022 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\": container with ID starting with 7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e not found: ID does not exist" containerID="7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.098094 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e"} err="failed to get container status \"7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\": rpc error: code = NotFound desc = could not find container \"7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e\": container with ID starting with 7f9a34d3b0ae6f483b34fef4c3e9595efe271b15f58d9991918ce33ede99551e not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.098114 4830 scope.go:117] "RemoveContainer" containerID="52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.098879 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\": container with ID starting with 52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75 not found: ID does not exist" containerID="52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.099354 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75"} err="failed to get container status \"52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\": rpc error: code = NotFound desc = could not find container \"52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75\": container with ID starting with 52bf7ea9535b8631243ce56c3b3f185a0bf45834f3aae18d9c17cdd00e664d75 not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.099422 4830 scope.go:117] "RemoveContainer" containerID="b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.100196 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\": container with ID starting with b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e not found: ID does not exist" containerID="b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.100241 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e"} err="failed to get container status \"b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\": rpc error: code = NotFound desc = could not find container \"b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e\": container with ID starting with b9580e49a2c4635e81e24f0db9f7240909ebf0b8a3129b88ffa7732693524f1e not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.100393 4830 scope.go:117] "RemoveContainer" containerID="4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.101682 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\": container with ID starting with 4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b not found: ID does not exist" containerID="4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b"
Mar 18 18:05:59 crc kubenswrapper[4830]: I0318 18:05:59.101738 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b"} err="failed to get container status \"4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\": rpc error: code = NotFound desc = could not find container \"4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b\": container with ID starting with 4e79885891f39b4348094dcd3e2043eb0759994e0596c427418a4b16a16af03b not found: ID does not exist"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.169325 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms"
Mar 18 18:05:59 crc kubenswrapper[4830]: E0318 18:05:59.570218 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms"
Mar 18 18:06:00 crc kubenswrapper[4830]: I0318 18:06:00.246700 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 18 18:06:00 crc kubenswrapper[4830]: E0318 18:06:00.371798 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s"
Mar 18 18:06:00 crc kubenswrapper[4830]: E0318 18:06:00.974446 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-57d7f4b84d-pk78f.189e01af63b96d84 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-57d7f4b84d-pk78f,UID:4ecdc8c4-09c4-47a3-a705-dd2bce091671,APIVersion:v1,ResourceVersion:29866,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:05:56.034555268 +0000 UTC m=+190.602185610,LastTimestamp:2026-03-18 18:05:56.034555268 +0000 UTC m=+190.602185610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 18:06:01 crc kubenswrapper[4830]: E0318 18:06:01.977017 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s"
Mar 18 18:06:05 crc kubenswrapper[4830]: E0318 18:06:05.178414 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="6.4s"
Mar 18 18:06:06 crc kubenswrapper[4830]: I0318 18:06:06.243635 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:06 crc kubenswrapper[4830]: I0318 18:06:06.245475 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:06 crc kubenswrapper[4830]: I0318 18:06:06.404242 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:06:06 crc kubenswrapper[4830]: I0318 18:06:06.404337 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.234160 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.236157 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.238006 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.255071 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.255132 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:08 crc kubenswrapper[4830]: E0318 18:06:08.255838 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:08 crc kubenswrapper[4830]: I0318 18:06:08.256546 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:08 crc kubenswrapper[4830]: W0318 18:06:08.284221 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-524b1672d73e4fca8c77f679abbbf7af751e579b52e02ecf6e66ea74eac60f0c WatchSource:0}: Error finding container 524b1672d73e4fca8c77f679abbbf7af751e579b52e02ecf6e66ea74eac60f0c: Status 404 returned error can't find the container with id 524b1672d73e4fca8c77f679abbbf7af751e579b52e02ecf6e66ea74eac60f0c
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.006175 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.007303 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.007498 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="db6e385c732bc785580a68becee1b84a1c19c39ac93deca38e9800fb37658efd" exitCode=1
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.007558 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"db6e385c732bc785580a68becee1b84a1c19c39ac93deca38e9800fb37658efd"}
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.008409 4830 scope.go:117] "RemoveContainer" containerID="db6e385c732bc785580a68becee1b84a1c19c39ac93deca38e9800fb37658efd"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.008660 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.009222 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.009653 4830 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.010956 4830 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b4c62c0211859c375491aea93a10bc86c38d80c6adb086a5897c14a65f824e98" exitCode=0
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.011026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b4c62c0211859c375491aea93a10bc86c38d80c6adb086a5897c14a65f824e98"}
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.011057 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"524b1672d73e4fca8c77f679abbbf7af751e579b52e02ecf6e66ea74eac60f0c"}
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.011610 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.011631 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.012100 4830 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:09 crc kubenswrapper[4830]: E0318 18:06:09.012298 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.012316 4830 status_manager.go:851] "Failed to get status for pod" podUID="4ecdc8c4-09c4-47a3-a705-dd2bce091671" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-57d7f4b84d-pk78f\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:09 crc kubenswrapper[4830]: I0318 18:06:09.012651 4830 status_manager.go:851] "Failed to get status for pod" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.027689 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.029530 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.029639 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5743a850948a2e4e0bd22b0c983c3c499dde828a22896528d94a8971eb021aa"}
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.037395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d658ec37c9f39fde312a76c3899f55641bbb8589bf669084d17f493ecb1e5390"}
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.037450 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e86fd01b9b47bbd553bac350c4904fe00519bce443be719ba2b33787fbe1f3f1"}
Mar 18 18:06:10 crc kubenswrapper[4830]: I0318 18:06:10.037467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3f497116db98f26de4c8c2a83e36519004feed667e66f77740f7c8bc46ddc972"}
Mar 18 18:06:11 crc kubenswrapper[4830]: I0318 18:06:11.046584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"46e1bae9675bf27855bad64ad56cd072c38ab2ffcbe11020d54623e7c5bb0751"}
Mar 18 18:06:11 crc kubenswrapper[4830]: I0318 18:06:11.047849 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:11 crc kubenswrapper[4830]: I0318 18:06:11.047981 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"21677f7f3b22a6c9f34c12cdf9b519cfe01f0dee069dbbbdd93c0e980ff54d91"}
Mar 18 18:06:11 crc kubenswrapper[4830]: I0318 18:06:11.047052 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:11 crc kubenswrapper[4830]: I0318 18:06:11.048138 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:13 crc kubenswrapper[4830]: I0318 18:06:13.257098 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:13 crc kubenswrapper[4830]: I0318 18:06:13.257182 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:13 crc kubenswrapper[4830]: I0318 18:06:13.264864 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:14 crc kubenswrapper[4830]: I0318 18:06:14.260402 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:06:14 crc kubenswrapper[4830]: I0318 18:06:14.265159 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:06:15 crc kubenswrapper[4830]: I0318 18:06:15.072511 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.057562 4830 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.078206 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.078262 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.082018 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.266988 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6adb45aa-4263-4d50-9ce1-54d458d8cbf4"
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.399401 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 18:06:16 crc kubenswrapper[4830]: I0318 18:06:16.399451 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:06:17 crc kubenswrapper[4830]: I0318 18:06:17.084151 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:17 crc kubenswrapper[4830]: I0318 18:06:17.084199 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3cd4d209-2ecf-4749-bf99-6819f6608a4b"
Mar 18 18:06:17 crc kubenswrapper[4830]: I0318 18:06:17.089852 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6adb45aa-4263-4d50-9ce1-54d458d8cbf4"
Mar 18 18:06:25 crc kubenswrapper[4830]: I0318 18:06:25.361854 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 18:06:25 crc kubenswrapper[4830]: I0318 18:06:25.969199 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 18:06:26 crc kubenswrapper[4830]: I0318 18:06:26.152744 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:47294->10.217.0.67:8443: read: connection reset by peer" start-of-body=
Mar 18 18:06:26 crc kubenswrapper[4830]: I0318 18:06:26.153213 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:47294->10.217.0.67:8443: read: connection reset by peer"
Mar 18 18:06:26 crc kubenswrapper[4830]: I0318 18:06:26.157180 4830 patch_prober.go:28] interesting pod/route-controller-manager-65f7d746d6-s8skv container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:47296->10.217.0.67:8443: read: connection reset by peer" start-of-body=
Mar 18 18:06:26 crc kubenswrapper[4830]: I0318 18:06:26.157364 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" podUID="30e39130-c686-4adb-abf1-854a478ec384" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:47296->10.217.0.67:8443: read: connection reset by peer"
Mar 18 18:06:26 crc kubenswrapper[4830]: I0318 18:06:26.923143 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.154910 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-65f7d746d6-s8skv_30e39130-c686-4adb-abf1-854a478ec384/route-controller-manager/0.log"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.154969 4830 generic.go:334] "Generic (PLEG): container finished" podID="30e39130-c686-4adb-abf1-854a478ec384" containerID="58d5eae5d118c4880937cc4e62ab281ff916bd120e7a74814223061cb6b7661a" exitCode=255
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.155011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" event={"ID":"30e39130-c686-4adb-abf1-854a478ec384","Type":"ContainerDied","Data":"58d5eae5d118c4880937cc4e62ab281ff916bd120e7a74814223061cb6b7661a"}
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.155599 4830 scope.go:117] "RemoveContainer" containerID="58d5eae5d118c4880937cc4e62ab281ff916bd120e7a74814223061cb6b7661a"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.377484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.451211 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.498336 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.530408 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.583383 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.691323 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.739556 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.757072 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 18:06:27 crc kubenswrapper[4830]: I0318 18:06:27.902707 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.165436 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-65f7d746d6-s8skv_30e39130-c686-4adb-abf1-854a478ec384/route-controller-manager/0.log"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.165955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv" event={"ID":"30e39130-c686-4adb-abf1-854a478ec384","Type":"ContainerStarted","Data":"5a401d97974e1791eec00a0cadbcaebe9fa70be0c787a392c045714369cdcc84"}
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.167074 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.176903 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65f7d746d6-s8skv"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.319495 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.472925 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.500469 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.738027 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.787514 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 18:06:28 crc kubenswrapper[4830]: I0318 18:06:28.875136 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.021820 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.090522 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.308912 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.323907 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.497754 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.509414 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.509511 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.578692 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.663486 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.709403 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.715437 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.725556 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.837620 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 18:06:29 crc kubenswrapper[4830]: I0318 18:06:29.991606 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.092276 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.198863 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.220594 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.247514 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.268761 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.275690 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.303620 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.382060 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.495853 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.504733 4830 ???:1] "http: TLS handshake error from 192.168.126.11:37674: no serving certificate available for the kubelet"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.524957 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.525906 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57d7f4b84d-pk78f" podStartSLOduration=37.525885643 podStartE2EDuration="37.525885643s" podCreationTimestamp="2026-03-18 18:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:15.780259267 +0000 UTC m=+210.347889609" watchObservedRunningTime="2026-03-18 18:06:30.525885643 +0000 UTC m=+225.093515975"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.530524 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.530574 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.536356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.546986 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.553341 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.553328079 podStartE2EDuration="14.553328079s" podCreationTimestamp="2026-03-18 18:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:30.549953456 +0000 UTC m=+225.117583788" watchObservedRunningTime="2026-03-18 18:06:30.553328079 +0000 UTC m=+225.120958411"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.560457 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.622306 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.782395 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.796262 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.850851 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 18 18:06:30 crc kubenswrapper[4830]: I0318 18:06:30.904663 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.004174 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.021460 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.066883 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.387831 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.411707 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.544527 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.553325 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.582064 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.695526 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.740192 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.869173 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.925157 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 18:06:31 crc kubenswrapper[4830]: I0318 18:06:31.991971 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.030118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.080119 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.153767 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.162484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.167755 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.259123 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.283092 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.324697 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.327429 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.478559 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.602250 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.637510 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.646548 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.702505 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.741175 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.766103 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.797496 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.803506 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 18:06:32 crc kubenswrapper[4830]: I0318 18:06:32.990561 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.082235 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.178859 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.240028 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.313686 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.356064 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.481460 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.490582 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.534670 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.579491 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.738069 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.753282 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.830696 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 18:06:33 crc kubenswrapper[4830]: I0318 18:06:33.979906 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.012904 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.045150 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.046275 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.103069 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.112478 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.171399 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.211145 4830 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.357077 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.464189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.464364 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.513541 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.557711 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.627489 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.637969 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.682547 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.753168 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.794760 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 18:06:34 crc 
kubenswrapper[4830]: I0318 18:06:34.835441 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.848521 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.880430 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 18:06:34 crc kubenswrapper[4830]: I0318 18:06:34.988899 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.017966 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.102056 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.210736 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.294950 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.388990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.409431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.546488 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 18:06:35 crc 
kubenswrapper[4830]: I0318 18:06:35.553723 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.641656 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.689809 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.690039 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.725222 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.771899 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.794016 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.825491 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.924674 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.987118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 18:06:35 crc kubenswrapper[4830]: I0318 18:06:35.997164 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" 
Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.049290 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.086082 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.110864 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.144968 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.181446 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.252439 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.329586 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.350706 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.352405 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.394201 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.459888 4830 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.484568 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.551556 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.615734 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.713372 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.822373 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.857613 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.861398 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.912165 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 18:06:36 crc kubenswrapper[4830]: I0318 18:06:36.926050 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.007737 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 18:06:37 crc 
kubenswrapper[4830]: I0318 18:06:37.053167 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.063132 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.121879 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.212044 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.273650 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.365443 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.424991 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564286-l7vpb"] Mar 18 18:06:37 crc kubenswrapper[4830]: E0318 18:06:37.425226 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" containerName="installer" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.425237 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" containerName="installer" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.425318 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1d7a8b-7a09-41f3-828b-83e57be4bac7" containerName="installer" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.425668 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.430021 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.430033 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.430192 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.438056 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.472290 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.476424 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w876s\" (UniqueName: \"kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s\") pod \"auto-csr-approver-29564286-l7vpb\" (UID: \"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd\") " pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.483096 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.530560 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.578008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w876s\" 
(UniqueName: \"kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s\") pod \"auto-csr-approver-29564286-l7vpb\" (UID: \"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd\") " pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.597929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w876s\" (UniqueName: \"kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s\") pod \"auto-csr-approver-29564286-l7vpb\" (UID: \"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd\") " pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.623216 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.678383 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.724848 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.734346 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.739509 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.767056 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.795089 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.799917 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.806759 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.869179 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 18:06:37 crc kubenswrapper[4830]: I0318 18:06:37.982162 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.019146 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.108859 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.133455 4830 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.203354 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.256310 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.332538 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.422412 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.445321 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.445541 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.445764 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://75d69fbbb950d1a6e04f1412e46caeb6dd8f8e1fe102e6aed90390def08f887c" gracePeriod=5 Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.452736 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.527525 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.561516 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.612190 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:06:38 crc kubenswrapper[4830]: 
I0318 18:06:38.630799 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.718832 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.726822 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.727046 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.833924 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.839697 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.846288 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.854343 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.865016 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.920515 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.944351 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.958856 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.975621 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 18:06:38 crc kubenswrapper[4830]: I0318 18:06:38.994397 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.030279 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.164343 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.190054 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.226215 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.236316 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.236585 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.243633 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 
18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.396101 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.400750 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.431696 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.489522 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.547454 4830 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.631564 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.633609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.689950 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.765182 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.796183 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.806410 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert"
Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.908934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.938978 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.941803 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-l7vpb"]
Mar 18 18:06:39 crc kubenswrapper[4830]: I0318 18:06:39.942472 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.112543 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.215838 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.284948 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.303893 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.310614 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.319565 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.431290 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.456668 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-l7vpb"]
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.535651 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.551228 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.730156 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.760364 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.767137 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.777151 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.789430 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.910925 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.958318 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 18:06:40 crc kubenswrapper[4830]: I0318 18:06:40.961938 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.045992 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.115031 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.241035 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" event={"ID":"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd","Type":"ContainerStarted","Data":"a5d7b29ffd486103333c76bf01db498469a81629fe6499e8b8402c09e296f460"}
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.308915 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.379196 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.528021 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.653363 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.769170 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 18:06:41 crc kubenswrapper[4830]: I0318 18:06:41.939913 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.110699 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.177559 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.244924 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.277256 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.302882 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.303114 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2tsg6" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="registry-server" containerID="cri-o://b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6" gracePeriod=30
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.313757 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szdp2"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.314793 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szdp2" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="registry-server" containerID="cri-o://276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e" gracePeriod=30
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.341461 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.341731 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" containerID="cri-o://7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28" gracePeriod=30
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.348178 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.348405 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pgl85" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="registry-server" containerID="cri-o://f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7" gracePeriod=30
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.353565 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqf2s"]
Mar 18 18:06:42 crc kubenswrapper[4830]: E0318 18:06:42.353797 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.353812 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.353898 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.354263 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.382258 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.382611 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rhcb" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="registry-server" containerID="cri-o://35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c" gracePeriod=30
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.383179 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.386729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqf2s"]
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.449727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.449806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ssf5\" (UniqueName: \"kubernetes.io/projected/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-kube-api-access-4ssf5\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.449854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.550566 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.551063 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ssf5\" (UniqueName: \"kubernetes.io/projected/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-kube-api-access-4ssf5\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.551114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.552313 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.559612 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.568331 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ssf5\" (UniqueName: \"kubernetes.io/projected/77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f-kube-api-access-4ssf5\") pod \"marketplace-operator-79b997595-xqf2s\" (UID: \"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.738203 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.830678 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2tsg6"
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.955580 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities\") pod \"4d8e7b87-f442-4d60-bd65-35eacd097689\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") "
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.955633 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content\") pod \"4d8e7b87-f442-4d60-bd65-35eacd097689\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") "
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.955723 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqkg9\" (UniqueName: \"kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9\") pod \"4d8e7b87-f442-4d60-bd65-35eacd097689\" (UID: \"4d8e7b87-f442-4d60-bd65-35eacd097689\") "
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.959196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9" (OuterVolumeSpecName: "kube-api-access-dqkg9") pod "4d8e7b87-f442-4d60-bd65-35eacd097689" (UID: "4d8e7b87-f442-4d60-bd65-35eacd097689"). InnerVolumeSpecName "kube-api-access-dqkg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:42 crc kubenswrapper[4830]: I0318 18:06:42.960433 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities" (OuterVolumeSpecName: "utilities") pod "4d8e7b87-f442-4d60-bd65-35eacd097689" (UID: "4d8e7b87-f442-4d60-bd65-35eacd097689"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.011977 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szdp2"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.017253 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rhcb"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.018384 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d8e7b87-f442-4d60-bd65-35eacd097689" (UID: "4d8e7b87-f442-4d60-bd65-35eacd097689"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.018410 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgl85"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.026274 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.056704 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqkg9\" (UniqueName: \"kubernetes.io/projected/4d8e7b87-f442-4d60-bd65-35eacd097689-kube-api-access-dqkg9\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.057312 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.057323 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8e7b87-f442-4d60-bd65-35eacd097689-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.163864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th75n\" (UniqueName: \"kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n\") pod \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.163956 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbpht\" (UniqueName: \"kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht\") pod \"28acc7fe-7976-4396-89b7-c17a9e836b22\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164004 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities\") pod \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164042 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics\") pod \"7332042a-dffc-4c3e-94eb-2a1dedc58062\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164072 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca\") pod \"7332042a-dffc-4c3e-94eb-2a1dedc58062\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164107 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities\") pod \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164124 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476hk\" (UniqueName: \"kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk\") pod \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164146 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content\") pod \"28acc7fe-7976-4396-89b7-c17a9e836b22\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164169 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content\") pod \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\" (UID: \"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164196 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855js\" (UniqueName: \"kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js\") pod \"7332042a-dffc-4c3e-94eb-2a1dedc58062\" (UID: \"7332042a-dffc-4c3e-94eb-2a1dedc58062\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164230 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities\") pod \"28acc7fe-7976-4396-89b7-c17a9e836b22\" (UID: \"28acc7fe-7976-4396-89b7-c17a9e836b22\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164250 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content\") pod \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\" (UID: \"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b\") "
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.164828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities" (OuterVolumeSpecName: "utilities") pod "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" (UID: "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.165316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7332042a-dffc-4c3e-94eb-2a1dedc58062" (UID: "7332042a-dffc-4c3e-94eb-2a1dedc58062"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.166961 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht" (OuterVolumeSpecName: "kube-api-access-jbpht") pod "28acc7fe-7976-4396-89b7-c17a9e836b22" (UID: "28acc7fe-7976-4396-89b7-c17a9e836b22"). InnerVolumeSpecName "kube-api-access-jbpht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.167073 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities" (OuterVolumeSpecName: "utilities") pod "28acc7fe-7976-4396-89b7-c17a9e836b22" (UID: "28acc7fe-7976-4396-89b7-c17a9e836b22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.167483 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities" (OuterVolumeSpecName: "utilities") pod "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" (UID: "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.170828 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.171151 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.175429 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7332042a-dffc-4c3e-94eb-2a1dedc58062" (UID: "7332042a-dffc-4c3e-94eb-2a1dedc58062"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.176026 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n" (OuterVolumeSpecName: "kube-api-access-th75n") pod "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" (UID: "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b"). InnerVolumeSpecName "kube-api-access-th75n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.178732 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js" (OuterVolumeSpecName: "kube-api-access-855js") pod "7332042a-dffc-4c3e-94eb-2a1dedc58062" (UID: "7332042a-dffc-4c3e-94eb-2a1dedc58062"). InnerVolumeSpecName "kube-api-access-855js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.182637 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk" (OuterVolumeSpecName: "kube-api-access-476hk") pod "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" (UID: "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00"). InnerVolumeSpecName "kube-api-access-476hk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.211608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28acc7fe-7976-4396-89b7-c17a9e836b22" (UID: "28acc7fe-7976-4396-89b7-c17a9e836b22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.246386 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" (UID: "b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.253957 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerID="b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6" exitCode=0
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.254042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerDied","Data":"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.254078 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2tsg6" event={"ID":"4d8e7b87-f442-4d60-bd65-35eacd097689","Type":"ContainerDied","Data":"71d5582447836975691b3f07089271b4406186ddc756751f426b635cc694078b"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.254099 4830 scope.go:117] "RemoveContainer" containerID="b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.254295 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2tsg6"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.258690 4830 generic.go:334] "Generic (PLEG): container finished" podID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerID="35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c" exitCode=0
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.258891 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerDied","Data":"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.260853 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rhcb" event={"ID":"b9160fc9-aa00-4ce7-9ea2-15aac1e11e00","Type":"ContainerDied","Data":"81ee93b655560cebea802e010ceae8cd1a58abe1d5512b542da024ce9497bc30"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.258988 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rhcb"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.268233 4830 generic.go:334] "Generic (PLEG): container finished" podID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerID="f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7" exitCode=0
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.268357 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerDied","Data":"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.268395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgl85" event={"ID":"28acc7fe-7976-4396-89b7-c17a9e836b22","Type":"ContainerDied","Data":"6320f1125d9bfab72190e3c7917331467a5a0d30f90223ee475c0666444555ab"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.268473 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgl85"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.275865 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqf2s"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.275958 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th75n\" (UniqueName: \"kubernetes.io/projected/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-kube-api-access-th75n\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.275978 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbpht\" (UniqueName: \"kubernetes.io/projected/28acc7fe-7976-4396-89b7-c17a9e836b22-kube-api-access-jbpht\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.275990 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7332042a-dffc-4c3e-94eb-2a1dedc58062-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276000 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-476hk\" (UniqueName: \"kubernetes.io/projected/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-kube-api-access-476hk\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276010 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276020 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276030 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855js\" (UniqueName: \"kubernetes.io/projected/7332042a-dffc-4c3e-94eb-2a1dedc58062-kube-api-access-855js\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276039 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28acc7fe-7976-4396-89b7-c17a9e836b22-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.276048 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.279348 4830 generic.go:334] "Generic (PLEG): container finished" podID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerID="276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e" exitCode=0
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.279532 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szdp2"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.279587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerDied","Data":"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.279694 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szdp2" event={"ID":"b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b","Type":"ContainerDied","Data":"c4679f51866bb6dfe4b090db3a1f5996d6f4217a9f48336c793076441a85c87e"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.282211 4830 generic.go:334] "Generic (PLEG): container finished" podID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerID="7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28" exitCode=0
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.282252 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" event={"ID":"7332042a-dffc-4c3e-94eb-2a1dedc58062","Type":"ContainerDied","Data":"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.282270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42" event={"ID":"7332042a-dffc-4c3e-94eb-2a1dedc58062","Type":"ContainerDied","Data":"e3b3476a4f611f80c90c6f9c9f4cd6702275d3470a1c6289c245f21adcc96e36"}
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.282328 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j2j42"
Mar 18 18:06:43 crc kubenswrapper[4830]: W0318 18:06:43.285595 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c8fe94_c2c8_419b_a4c0_a1bc5f4d011f.slice/crio-e79ca74605d08991f2242f1b5b0997972c4db64c2f1745971a85f0ee4485d121 WatchSource:0}: Error finding container e79ca74605d08991f2242f1b5b0997972c4db64c2f1745971a85f0ee4485d121: Status 404 returned error can't find the container with id e79ca74605d08991f2242f1b5b0997972c4db64c2f1745971a85f0ee4485d121
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.288077 4830 scope.go:117] "RemoveContainer" containerID="b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.289731 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.322964 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" (UID: "b9160fc9-aa00-4ce7-9ea2-15aac1e11e00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.329266 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.335207 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgl85"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.339303 4830 scope.go:117] "RemoveContainer" containerID="a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.350482 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.356805 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j2j42"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.362822 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.363836 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szdp2"]
Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.364544 4830 scope.go:117] "RemoveContainer" containerID="b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"
Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.365554 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6\": container with ID starting with b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6 not found: ID does not exist" containerID="b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"
Mar 18 18:06:43 crc
kubenswrapper[4830]: I0318 18:06:43.365592 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6"} err="failed to get container status \"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6\": rpc error: code = NotFound desc = could not find container \"b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6\": container with ID starting with b40570a257725b0ded8998c2f6f61f7ab3b0c55b679b190aaefa939b3cd1eef6 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.365621 4830 scope.go:117] "RemoveContainer" containerID="b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.366978 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szdp2"] Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.367346 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad\": container with ID starting with b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad not found: ID does not exist" containerID="b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.367400 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad"} err="failed to get container status \"b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad\": rpc error: code = NotFound desc = could not find container \"b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad\": container with ID starting with b827e2a6908d53b4c0a9cb794f216f84c284e15b68f5923e2bb47c52f5a2b7ad not found: ID does not exist" Mar 18 18:06:43 
crc kubenswrapper[4830]: I0318 18:06:43.367428 4830 scope.go:117] "RemoveContainer" containerID="a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.369569 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad\": container with ID starting with a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad not found: ID does not exist" containerID="a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.369605 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad"} err="failed to get container status \"a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad\": rpc error: code = NotFound desc = could not find container \"a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad\": container with ID starting with a412c2964a8fd96a07c4fec16c5e15e2062037a306dcb2afb25431fe647d9fad not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.369634 4830 scope.go:117] "RemoveContainer" containerID="35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.370254 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"] Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.373002 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2tsg6"] Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.379145 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.394934 4830 scope.go:117] "RemoveContainer" containerID="ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.420685 4830 scope.go:117] "RemoveContainer" containerID="3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.451800 4830 scope.go:117] "RemoveContainer" containerID="35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.452484 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c\": container with ID starting with 35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c not found: ID does not exist" containerID="35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.452539 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c"} err="failed to get container status \"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c\": rpc error: code = NotFound desc = could not find container \"35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c\": container with ID starting with 35eb2be93b7d0764a58ab2a57d8f800a513854a8cc2da33c69f8cae68f95351c not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.452571 4830 scope.go:117] "RemoveContainer" containerID="ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.452884 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c\": container with ID starting with ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c not found: ID does not exist" containerID="ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.452920 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c"} err="failed to get container status \"ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c\": rpc error: code = NotFound desc = could not find container \"ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c\": container with ID starting with ea24c23eb2065a637c2699e54f61c0d7e6bf6d3e2a62993a38f72bf1cab2049c not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.452945 4830 scope.go:117] "RemoveContainer" containerID="3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.453142 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6\": container with ID starting with 3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6 not found: ID does not exist" containerID="3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.453170 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6"} err="failed to get container status \"3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6\": rpc error: code = NotFound desc = could not find container \"3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6\": container with ID 
starting with 3a85770aba5f302eed9e1195ccf3291ad1f43d7562babdeb6847147d9a4077b6 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.453186 4830 scope.go:117] "RemoveContainer" containerID="f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.552679 4830 scope.go:117] "RemoveContainer" containerID="960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.563687 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.577978 4830 scope.go:117] "RemoveContainer" containerID="31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.599529 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"] Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.601232 4830 scope.go:117] "RemoveContainer" containerID="f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.601915 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7\": container with ID starting with f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7 not found: ID does not exist" containerID="f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.601962 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7"} err="failed to get container status \"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7\": rpc error: 
code = NotFound desc = could not find container \"f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7\": container with ID starting with f854cd2996b7a5f26ab5e48779f51b79c2a7a32754e3ff242dda78ccf37b6fe7 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.601993 4830 scope.go:117] "RemoveContainer" containerID="960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.602512 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1\": container with ID starting with 960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1 not found: ID does not exist" containerID="960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.602561 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1"} err="failed to get container status \"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1\": rpc error: code = NotFound desc = could not find container \"960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1\": container with ID starting with 960b5a884be83df21b88728efc6fd7d0af09db0ce1ca8f1c58b2b2f996cf2ea1 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.602599 4830 scope.go:117] "RemoveContainer" containerID="31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.602734 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rhcb"] Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.603403 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63\": container with ID starting with 31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63 not found: ID does not exist" containerID="31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.603479 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63"} err="failed to get container status \"31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63\": rpc error: code = NotFound desc = could not find container \"31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63\": container with ID starting with 31c12d81c60a6aa9a7f1b5897ace75b50a2ca587dfbd86d39e10f62dd3756f63 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.603513 4830 scope.go:117] "RemoveContainer" containerID="276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.619368 4830 scope.go:117] "RemoveContainer" containerID="d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.641556 4830 scope.go:117] "RemoveContainer" containerID="8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.671029 4830 scope.go:117] "RemoveContainer" containerID="276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.672956 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e\": container with ID starting with 276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e not found: ID does not exist" 
containerID="276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.673006 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e"} err="failed to get container status \"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e\": rpc error: code = NotFound desc = could not find container \"276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e\": container with ID starting with 276129449ac1ad9821a5efb88f1be1d11cf98fe4dfc24d32a7405056790f4d4e not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.673038 4830 scope.go:117] "RemoveContainer" containerID="d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.673895 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9\": container with ID starting with d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9 not found: ID does not exist" containerID="d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.673961 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9"} err="failed to get container status \"d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9\": rpc error: code = NotFound desc = could not find container \"d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9\": container with ID starting with d6884eefe6dc0814e6117e6448cd1e441b1930cffc4960080c7f9d8921b62cf9 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.674000 4830 scope.go:117] 
"RemoveContainer" containerID="8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.674488 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46\": container with ID starting with 8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46 not found: ID does not exist" containerID="8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.674548 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46"} err="failed to get container status \"8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46\": rpc error: code = NotFound desc = could not find container \"8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46\": container with ID starting with 8cc3adf8017632f8d233674fe40a69eff7e10960961467c569441ff2b858da46 not found: ID does not exist" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.674583 4830 scope.go:117] "RemoveContainer" containerID="7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28" Mar 18 18:06:43 crc kubenswrapper[4830]: I0318 18:06:43.694739 4830 scope.go:117] "RemoveContainer" containerID="7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28" Mar 18 18:06:43 crc kubenswrapper[4830]: E0318 18:06:43.695470 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28\": container with ID starting with 7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28 not found: ID does not exist" containerID="7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28" Mar 18 18:06:43 crc 
kubenswrapper[4830]: I0318 18:06:43.695517 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28"} err="failed to get container status \"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28\": rpc error: code = NotFound desc = could not find container \"7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28\": container with ID starting with 7a5a7383097e5257a71b6aa3da8a366746db9f368caa6f41e4d909a957e6fb28 not found: ID does not exist" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.242366 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" path="/var/lib/kubelet/pods/28acc7fe-7976-4396-89b7-c17a9e836b22/volumes" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.243492 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" path="/var/lib/kubelet/pods/4d8e7b87-f442-4d60-bd65-35eacd097689/volumes" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.244137 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" path="/var/lib/kubelet/pods/7332042a-dffc-4c3e-94eb-2a1dedc58062/volumes" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.245168 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" path="/var/lib/kubelet/pods/b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b/volumes" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.245754 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" path="/var/lib/kubelet/pods/b9160fc9-aa00-4ce7-9ea2-15aac1e11e00/volumes" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.291052 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s" event={"ID":"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f","Type":"ContainerStarted","Data":"1a1afc24b6eb6035367cadc1c9885f5b26ae31b64f0352382fee1c36fe4e754b"} Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.291092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s" event={"ID":"77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f","Type":"ContainerStarted","Data":"e79ca74605d08991f2242f1b5b0997972c4db64c2f1745971a85f0ee4485d121"} Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.291395 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.297727 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.299132 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.299184 4830 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="75d69fbbb950d1a6e04f1412e46caeb6dd8f8e1fe102e6aed90390def08f887c" exitCode=137 Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.306724 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xqf2s" podStartSLOduration=2.306708117 podStartE2EDuration="2.306708117s" podCreationTimestamp="2026-03-18 18:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:44.304482286 +0000 UTC m=+238.872112618" watchObservedRunningTime="2026-03-18 
18:06:44.306708117 +0000 UTC m=+238.874338449" Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.604930 4830 csr.go:261] certificate signing request csr-wpndp is approved, waiting to be issued Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.629936 4830 csr.go:257] certificate signing request csr-wpndp is issued Mar 18 18:06:44 crc kubenswrapper[4830]: I0318 18:06:44.711383 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 18:06:45 crc kubenswrapper[4830]: I0318 18:06:45.338486 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 18:06:45 crc kubenswrapper[4830]: I0318 18:06:45.632140 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 03:54:22.492730408 +0000 UTC Mar 18 18:06:45 crc kubenswrapper[4830]: I0318 18:06:45.632181 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6729h47m36.86055218s for next certificate rotation Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.138110 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.138175 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317465 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317510 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317531 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317550 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317625 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317686 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.317887 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.318050 4830 scope.go:117] "RemoveContainer" containerID="75d69fbbb950d1a6e04f1412e46caeb6dd8f8e1fe102e6aed90390def08f887c" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.318086 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.318982 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.319072 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.319551 4830 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.319568 4830 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.319578 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.319604 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.344974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.420645 4830 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.633647 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-08 01:13:57.607491029 +0000 UTC Mar 18 18:06:46 crc kubenswrapper[4830]: I0318 18:06:46.633864 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7087h7m10.973633205s for next certificate rotation Mar 18 18:06:48 crc kubenswrapper[4830]: I0318 18:06:48.247611 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 18:06:48 crc kubenswrapper[4830]: I0318 18:06:48.334714 4830 generic.go:334] "Generic (PLEG): container finished" podID="9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" containerID="fd9f357484bb5699efad9a8d926a5b96868a1510b7a97d470ebc5dbd41c4c863" exitCode=0 Mar 18 18:06:48 crc kubenswrapper[4830]: I0318 18:06:48.334824 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" event={"ID":"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd","Type":"ContainerDied","Data":"fd9f357484bb5699efad9a8d926a5b96868a1510b7a97d470ebc5dbd41c4c863"} Mar 18 18:06:49 crc kubenswrapper[4830]: I0318 18:06:49.872533 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:49 crc kubenswrapper[4830]: I0318 18:06:49.985841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w876s\" (UniqueName: \"kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s\") pod \"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd\" (UID: \"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd\") " Mar 18 18:06:49 crc kubenswrapper[4830]: I0318 18:06:49.995084 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s" (OuterVolumeSpecName: "kube-api-access-w876s") pod "9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" (UID: "9199b38a-eef8-4a83-a1b8-0f6fd6faaffd"). InnerVolumeSpecName "kube-api-access-w876s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:50 crc kubenswrapper[4830]: I0318 18:06:50.087912 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w876s\" (UniqueName: \"kubernetes.io/projected/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd-kube-api-access-w876s\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:50 crc kubenswrapper[4830]: I0318 18:06:50.348217 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" event={"ID":"9199b38a-eef8-4a83-a1b8-0f6fd6faaffd","Type":"ContainerDied","Data":"a5d7b29ffd486103333c76bf01db498469a81629fe6499e8b8402c09e296f460"} Mar 18 18:06:50 crc kubenswrapper[4830]: I0318 18:06:50.348256 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d7b29ffd486103333c76bf01db498469a81629fe6499e8b8402c09e296f460" Mar 18 18:06:50 crc kubenswrapper[4830]: I0318 18:06:50.348329 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-l7vpb" Mar 18 18:06:59 crc kubenswrapper[4830]: I0318 18:06:59.509527 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:06:59 crc kubenswrapper[4830]: I0318 18:06:59.511026 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.943747 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wnjfd"] Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944539 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944551 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944564 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944570 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944579 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" containerName="oc" Mar 18 
18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944585 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" containerName="oc" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944592 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944597 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944607 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944612 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944620 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944626 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944636 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944641 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944649 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="extract-content" Mar 18 18:07:21 crc 
kubenswrapper[4830]: I0318 18:07:21.944654 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944663 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944668 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="extract-content" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944674 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944680 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944688 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944694 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="extract-utilities" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944702 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944707 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944717 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="registry-server" Mar 18 18:07:21 
crc kubenswrapper[4830]: I0318 18:07:21.944723 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: E0318 18:07:21.944730 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944735 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944845 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69ad1d4-2ff3-44f3-8f73-aef02a79bb6b" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944853 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9160fc9-aa00-4ce7-9ea2-15aac1e11e00" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944869 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" containerName="oc" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944876 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8e7b87-f442-4d60-bd65-35eacd097689" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944883 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7332042a-dffc-4c3e-94eb-2a1dedc58062" containerName="marketplace-operator" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.944891 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="28acc7fe-7976-4396-89b7-c17a9e836b22" containerName="registry-server" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.945551 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.947978 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:07:21 crc kubenswrapper[4830]: I0318 18:07:21.964212 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnjfd"] Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.110547 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-88skg"] Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.111468 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.115310 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.127618 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88skg"] Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.132885 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-utilities\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.132978 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962dd\" (UniqueName: \"kubernetes.io/projected/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-kube-api-access-962dd\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " 
pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.133211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-catalog-content\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-catalog-content\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.234822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-catalog-content\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235557 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-utilities\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-catalog-content\") pod \"certified-operators-88skg\" (UID: 
\"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235632 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-utilities\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqnn\" (UniqueName: \"kubernetes.io/projected/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-kube-api-access-cdqnn\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.235708 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962dd\" (UniqueName: \"kubernetes.io/projected/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-kube-api-access-962dd\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.236427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-utilities\") pod \"community-operators-wnjfd\" (UID: \"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.260465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962dd\" (UniqueName: \"kubernetes.io/projected/68e35d1e-15cb-4293-aa87-cb04b6dc1a72-kube-api-access-962dd\") pod \"community-operators-wnjfd\" (UID: 
\"68e35d1e-15cb-4293-aa87-cb04b6dc1a72\") " pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.260986 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wnjfd" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.337328 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-utilities\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.337383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-catalog-content\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.337428 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqnn\" (UniqueName: \"kubernetes.io/projected/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-kube-api-access-cdqnn\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.338047 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-catalog-content\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.338147 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-utilities\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.359564 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqnn\" (UniqueName: \"kubernetes.io/projected/f7518e45-59c3-47b1-bd28-fc7f74a2dfaa-kube-api-access-cdqnn\") pod \"certified-operators-88skg\" (UID: \"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa\") " pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.424870 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88skg" Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.698613 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnjfd"] Mar 18 18:07:22 crc kubenswrapper[4830]: W0318 18:07:22.813270 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7518e45_59c3_47b1_bd28_fc7f74a2dfaa.slice/crio-87b6ddaaa3fca31d881e5323c88c78e8c9dd2bb110521aa991ec1cffff695197 WatchSource:0}: Error finding container 87b6ddaaa3fca31d881e5323c88c78e8c9dd2bb110521aa991ec1cffff695197: Status 404 returned error can't find the container with id 87b6ddaaa3fca31d881e5323c88c78e8c9dd2bb110521aa991ec1cffff695197 Mar 18 18:07:22 crc kubenswrapper[4830]: I0318 18:07:22.813724 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88skg"] Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.541401 4830 generic.go:334] "Generic (PLEG): container finished" podID="f7518e45-59c3-47b1-bd28-fc7f74a2dfaa" containerID="08115030fc976468ae68c1989a9347bc70b0cac7b4b0c094d58108254544aec2" exitCode=0 Mar 18 18:07:23 
crc kubenswrapper[4830]: I0318 18:07:23.541465 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88skg" event={"ID":"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa","Type":"ContainerDied","Data":"08115030fc976468ae68c1989a9347bc70b0cac7b4b0c094d58108254544aec2"} Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.541835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88skg" event={"ID":"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa","Type":"ContainerStarted","Data":"87b6ddaaa3fca31d881e5323c88c78e8c9dd2bb110521aa991ec1cffff695197"} Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.545062 4830 generic.go:334] "Generic (PLEG): container finished" podID="68e35d1e-15cb-4293-aa87-cb04b6dc1a72" containerID="1354a137995dd996d04bc85cb9377a1ca7d43a67999902732cb61bfdd2f0d746" exitCode=0 Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.545115 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnjfd" event={"ID":"68e35d1e-15cb-4293-aa87-cb04b6dc1a72","Type":"ContainerDied","Data":"1354a137995dd996d04bc85cb9377a1ca7d43a67999902732cb61bfdd2f0d746"} Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.545157 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnjfd" event={"ID":"68e35d1e-15cb-4293-aa87-cb04b6dc1a72","Type":"ContainerStarted","Data":"b3a60b82c95f01a99682c27c8020243164682f6c0c61372551167a13372c469a"} Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.714546 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7w7m4"] Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.715923 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.717621 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.728479 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7w7m4"] Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.759461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-utilities\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.759529 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-catalog-content\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.759571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sj6\" (UniqueName: \"kubernetes.io/projected/c9ff1801-8502-4f37-aa28-55a689cbb2c1-kube-api-access-t4sj6\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.860571 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-utilities\") pod \"redhat-marketplace-7w7m4\" (UID: 
\"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.860633 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-catalog-content\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.860655 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sj6\" (UniqueName: \"kubernetes.io/projected/c9ff1801-8502-4f37-aa28-55a689cbb2c1-kube-api-access-t4sj6\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.861161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-utilities\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.861273 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ff1801-8502-4f37-aa28-55a689cbb2c1-catalog-content\") pod \"redhat-marketplace-7w7m4\" (UID: \"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:23 crc kubenswrapper[4830]: I0318 18:07:23.879326 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4sj6\" (UniqueName: \"kubernetes.io/projected/c9ff1801-8502-4f37-aa28-55a689cbb2c1-kube-api-access-t4sj6\") pod \"redhat-marketplace-7w7m4\" (UID: 
\"c9ff1801-8502-4f37-aa28-55a689cbb2c1\") " pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.035392 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7w7m4" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.473988 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7w7m4"] Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.555821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88skg" event={"ID":"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa","Type":"ContainerStarted","Data":"6e981d8680391ab5bf6748e767f4cf9fc0d9a7c53ce3eb52b679e10c9accbb84"} Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.556951 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7w7m4" event={"ID":"c9ff1801-8502-4f37-aa28-55a689cbb2c1","Type":"ContainerStarted","Data":"380036df2eef85b02ebdf6f3b12bb4efdc7da7d2008ab58e750eb38c01bda333"} Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.713563 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6pt2b"] Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.714616 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pt2b" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.716424 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.730934 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pt2b"] Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.873802 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmpw\" (UniqueName: \"kubernetes.io/projected/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-kube-api-access-jdmpw\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.873904 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-catalog-content\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.873930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-utilities\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b" Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.975438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmpw\" (UniqueName: \"kubernetes.io/projected/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-kube-api-access-jdmpw\") pod \"redhat-operators-6pt2b\" (UID: 
\"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.975537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-catalog-content\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.975560 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-utilities\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.976183 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-catalog-content\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.976396 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-utilities\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:24 crc kubenswrapper[4830]: I0318 18:07:24.994076 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmpw\" (UniqueName: \"kubernetes.io/projected/f1147eda-0b31-4c1e-9923-f8d73c80f9a6-kube-api-access-jdmpw\") pod \"redhat-operators-6pt2b\" (UID: \"f1147eda-0b31-4c1e-9923-f8d73c80f9a6\") " pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.030679 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.424533 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pt2b"]
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.563075 4830 generic.go:334] "Generic (PLEG): container finished" podID="c9ff1801-8502-4f37-aa28-55a689cbb2c1" containerID="db64f27b8bbd1c4831f43a6f8f0e67b03d99b4f3eb92766e2b86485c36f5badc" exitCode=0
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.563130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7w7m4" event={"ID":"c9ff1801-8502-4f37-aa28-55a689cbb2c1","Type":"ContainerDied","Data":"db64f27b8bbd1c4831f43a6f8f0e67b03d99b4f3eb92766e2b86485c36f5badc"}
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.565517 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pt2b" event={"ID":"f1147eda-0b31-4c1e-9923-f8d73c80f9a6","Type":"ContainerStarted","Data":"4dfdd39dc5c2ed52ede416e3e7e3894896196cfcc81ddf5ea58cdf0eaa90ff26"}
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.568443 4830 generic.go:334] "Generic (PLEG): container finished" podID="68e35d1e-15cb-4293-aa87-cb04b6dc1a72" containerID="c867f3efca4bca64cac593716648787dc0869c825a389adc5a8c22d93d744c92" exitCode=0
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.568535 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnjfd" event={"ID":"68e35d1e-15cb-4293-aa87-cb04b6dc1a72","Type":"ContainerDied","Data":"c867f3efca4bca64cac593716648787dc0869c825a389adc5a8c22d93d744c92"}
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.570294 4830 generic.go:334] "Generic (PLEG): container finished" podID="f7518e45-59c3-47b1-bd28-fc7f74a2dfaa" containerID="6e981d8680391ab5bf6748e767f4cf9fc0d9a7c53ce3eb52b679e10c9accbb84" exitCode=0
Mar 18 18:07:25 crc kubenswrapper[4830]: I0318 18:07:25.570314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88skg" event={"ID":"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa","Type":"ContainerDied","Data":"6e981d8680391ab5bf6748e767f4cf9fc0d9a7c53ce3eb52b679e10c9accbb84"}
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.577371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnjfd" event={"ID":"68e35d1e-15cb-4293-aa87-cb04b6dc1a72","Type":"ContainerStarted","Data":"96d6af14fb350c67cdccd2d0ac046c2940f0b06e86127450e7923c4970fff509"}
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.579690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88skg" event={"ID":"f7518e45-59c3-47b1-bd28-fc7f74a2dfaa","Type":"ContainerStarted","Data":"2c8bb7b4b732a09a38994b0ffc6f037fc09a77da6700912a92c67d9045fb5f89"}
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.581611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7w7m4" event={"ID":"c9ff1801-8502-4f37-aa28-55a689cbb2c1","Type":"ContainerStarted","Data":"c856cfb69d61d2b13feedcc27d13a5db30c5e35546e20734107b07be69148a56"}
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.583426 4830 generic.go:334] "Generic (PLEG): container finished" podID="f1147eda-0b31-4c1e-9923-f8d73c80f9a6" containerID="ae8c5a5ad571b6105f30a4336b704b788ae434da17d4e7236153ab9178fe4f99" exitCode=0
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.583472 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pt2b" event={"ID":"f1147eda-0b31-4c1e-9923-f8d73c80f9a6","Type":"ContainerDied","Data":"ae8c5a5ad571b6105f30a4336b704b788ae434da17d4e7236153ab9178fe4f99"}
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.599901 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wnjfd" podStartSLOduration=3.186194441 podStartE2EDuration="5.599884049s" podCreationTimestamp="2026-03-18 18:07:21 +0000 UTC" firstStartedPulling="2026-03-18 18:07:23.546652595 +0000 UTC m=+278.114282927" lastFinishedPulling="2026-03-18 18:07:25.960342203 +0000 UTC m=+280.527972535" observedRunningTime="2026-03-18 18:07:26.595926961 +0000 UTC m=+281.163557413" watchObservedRunningTime="2026-03-18 18:07:26.599884049 +0000 UTC m=+281.167514381"
Mar 18 18:07:26 crc kubenswrapper[4830]: I0318 18:07:26.667730 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-88skg" podStartSLOduration=2.1901407 podStartE2EDuration="4.667715998s" podCreationTimestamp="2026-03-18 18:07:22 +0000 UTC" firstStartedPulling="2026-03-18 18:07:23.542909804 +0000 UTC m=+278.110540136" lastFinishedPulling="2026-03-18 18:07:26.020485102 +0000 UTC m=+280.588115434" observedRunningTime="2026-03-18 18:07:26.666256997 +0000 UTC m=+281.233887329" watchObservedRunningTime="2026-03-18 18:07:26.667715998 +0000 UTC m=+281.235346330"
Mar 18 18:07:27 crc kubenswrapper[4830]: I0318 18:07:27.591612 4830 generic.go:334] "Generic (PLEG): container finished" podID="c9ff1801-8502-4f37-aa28-55a689cbb2c1" containerID="c856cfb69d61d2b13feedcc27d13a5db30c5e35546e20734107b07be69148a56" exitCode=0
Mar 18 18:07:27 crc kubenswrapper[4830]: I0318 18:07:27.591733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7w7m4" event={"ID":"c9ff1801-8502-4f37-aa28-55a689cbb2c1","Type":"ContainerDied","Data":"c856cfb69d61d2b13feedcc27d13a5db30c5e35546e20734107b07be69148a56"}
Mar 18 18:07:28 crc kubenswrapper[4830]: I0318 18:07:28.598471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7w7m4" event={"ID":"c9ff1801-8502-4f37-aa28-55a689cbb2c1","Type":"ContainerStarted","Data":"6ceec29ef3c74f9468f105250b9ad056e1700cf856ea0bbc15267af292d42654"}
Mar 18 18:07:28 crc kubenswrapper[4830]: I0318 18:07:28.617110 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7w7m4" podStartSLOduration=2.920556329 podStartE2EDuration="5.617089645s" podCreationTimestamp="2026-03-18 18:07:23 +0000 UTC" firstStartedPulling="2026-03-18 18:07:25.564426861 +0000 UTC m=+280.132057183" lastFinishedPulling="2026-03-18 18:07:28.260960167 +0000 UTC m=+282.828590499" observedRunningTime="2026-03-18 18:07:28.615585692 +0000 UTC m=+283.183216024" watchObservedRunningTime="2026-03-18 18:07:28.617089645 +0000 UTC m=+283.184719977"
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.509466 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.509883 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.509941 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.510593 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.510664 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a" gracePeriod=600
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.605328 4830 generic.go:334] "Generic (PLEG): container finished" podID="f1147eda-0b31-4c1e-9923-f8d73c80f9a6" containerID="025db37d651e5e6fef158e95b68fed0db5c0d7302732ef0bfc2643227cfe8d3d" exitCode=0
Mar 18 18:07:29 crc kubenswrapper[4830]: I0318 18:07:29.605398 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pt2b" event={"ID":"f1147eda-0b31-4c1e-9923-f8d73c80f9a6","Type":"ContainerDied","Data":"025db37d651e5e6fef158e95b68fed0db5c0d7302732ef0bfc2643227cfe8d3d"}
Mar 18 18:07:30 crc kubenswrapper[4830]: I0318 18:07:30.615327 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pt2b" event={"ID":"f1147eda-0b31-4c1e-9923-f8d73c80f9a6","Type":"ContainerStarted","Data":"e9ad569fe2df5222f13fe74eb5bfb2fa7d5a0badc80777797facee0de6ed6b98"}
Mar 18 18:07:30 crc kubenswrapper[4830]: I0318 18:07:30.622726 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a" exitCode=0
Mar 18 18:07:30 crc kubenswrapper[4830]: I0318 18:07:30.622838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a"}
Mar 18 18:07:30 crc kubenswrapper[4830]: I0318 18:07:30.622930 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e"}
Mar 18 18:07:30 crc kubenswrapper[4830]: I0318 18:07:30.646034 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6pt2b" podStartSLOduration=3.069602731 podStartE2EDuration="6.64599878s" podCreationTimestamp="2026-03-18 18:07:24 +0000 UTC" firstStartedPulling="2026-03-18 18:07:26.584691477 +0000 UTC m=+281.152321809" lastFinishedPulling="2026-03-18 18:07:30.161087526 +0000 UTC m=+284.728717858" observedRunningTime="2026-03-18 18:07:30.640073852 +0000 UTC m=+285.207704224" watchObservedRunningTime="2026-03-18 18:07:30.64599878 +0000 UTC m=+285.213629152"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.261483 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wnjfd"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.263415 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wnjfd"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.317465 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wnjfd"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.427071 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-88skg"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.427124 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-88skg"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.473749 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-88skg"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.676324 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-88skg"
Mar 18 18:07:32 crc kubenswrapper[4830]: I0318 18:07:32.681842 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wnjfd"
Mar 18 18:07:34 crc kubenswrapper[4830]: I0318 18:07:34.035639 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7w7m4"
Mar 18 18:07:34 crc kubenswrapper[4830]: I0318 18:07:34.036764 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7w7m4"
Mar 18 18:07:34 crc kubenswrapper[4830]: I0318 18:07:34.083401 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7w7m4"
Mar 18 18:07:34 crc kubenswrapper[4830]: I0318 18:07:34.718180 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7w7m4"
Mar 18 18:07:35 crc kubenswrapper[4830]: I0318 18:07:35.032148 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:35 crc kubenswrapper[4830]: I0318 18:07:35.032202 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:36 crc kubenswrapper[4830]: I0318 18:07:36.082070 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6pt2b" podUID="f1147eda-0b31-4c1e-9923-f8d73c80f9a6" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:07:36 crc kubenswrapper[4830]: 	timeout: failed to connect service ":50051" within 1s
Mar 18 18:07:36 crc kubenswrapper[4830]:  >
Mar 18 18:07:45 crc kubenswrapper[4830]: I0318 18:07:45.100854 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:07:45 crc kubenswrapper[4830]: I0318 18:07:45.169623 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6pt2b"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.145483 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564288-n47gr"]
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.146824 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.149629 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.150632 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.150915 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.165954 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-n47gr"]
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.368125 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqwg\" (UniqueName: \"kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg\") pod \"auto-csr-approver-29564288-n47gr\" (UID: \"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f\") " pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.468855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqwg\" (UniqueName: \"kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg\") pod \"auto-csr-approver-29564288-n47gr\" (UID: \"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f\") " pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.487038 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqwg\" (UniqueName: \"kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg\") pod \"auto-csr-approver-29564288-n47gr\" (UID: \"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f\") " pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:00 crc kubenswrapper[4830]: I0318 18:08:00.568184 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:01 crc kubenswrapper[4830]: I0318 18:08:01.041239 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-n47gr"]
Mar 18 18:08:01 crc kubenswrapper[4830]: I0318 18:08:01.809067 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-n47gr" event={"ID":"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f","Type":"ContainerStarted","Data":"c7a3c6ae5050b5119c6202ec71f9b68f733d27e54aa72196c3102ab1ea63c244"}
Mar 18 18:08:02 crc kubenswrapper[4830]: I0318 18:08:02.817693 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-n47gr" event={"ID":"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f","Type":"ContainerStarted","Data":"81ed9c688bd7c555b9c89150a01f87551b09da4b2f705a1f3f051db59cce1b69"}
Mar 18 18:08:02 crc kubenswrapper[4830]: I0318 18:08:02.837535 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564288-n47gr" podStartSLOduration=1.5584528579999999 podStartE2EDuration="2.83750218s" podCreationTimestamp="2026-03-18 18:08:00 +0000 UTC" firstStartedPulling="2026-03-18 18:08:01.065998439 +0000 UTC m=+315.633628781" lastFinishedPulling="2026-03-18 18:08:02.345047731 +0000 UTC m=+316.912678103" observedRunningTime="2026-03-18 18:08:02.83465767 +0000 UTC m=+317.402288062" watchObservedRunningTime="2026-03-18 18:08:02.83750218 +0000 UTC m=+317.405132552"
Mar 18 18:08:03 crc kubenswrapper[4830]: I0318 18:08:03.828001 4830 generic.go:334] "Generic (PLEG): container finished" podID="1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" containerID="81ed9c688bd7c555b9c89150a01f87551b09da4b2f705a1f3f051db59cce1b69" exitCode=0
Mar 18 18:08:03 crc kubenswrapper[4830]: I0318 18:08:03.828414 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-n47gr" event={"ID":"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f","Type":"ContainerDied","Data":"81ed9c688bd7c555b9c89150a01f87551b09da4b2f705a1f3f051db59cce1b69"}
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.189924 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.334271 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cqwg\" (UniqueName: \"kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg\") pod \"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f\" (UID: \"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f\") "
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.342794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg" (OuterVolumeSpecName: "kube-api-access-6cqwg") pod "1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" (UID: "1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f"). InnerVolumeSpecName "kube-api-access-6cqwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.438187 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cqwg\" (UniqueName: \"kubernetes.io/projected/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f-kube-api-access-6cqwg\") on node \"crc\" DevicePath \"\""
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.843275 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-n47gr" event={"ID":"1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f","Type":"ContainerDied","Data":"c7a3c6ae5050b5119c6202ec71f9b68f733d27e54aa72196c3102ab1ea63c244"}
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.843359 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7a3c6ae5050b5119c6202ec71f9b68f733d27e54aa72196c3102ab1ea63c244"
Mar 18 18:08:05 crc kubenswrapper[4830]: I0318 18:08:05.843404 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-n47gr"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.222286 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqj7b"]
Mar 18 18:08:06 crc kubenswrapper[4830]: E0318 18:08:06.222511 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" containerName="oc"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.222523 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" containerName="oc"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.222614 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" containerName="oc"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.223011 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.249268 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqj7b"]
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349105 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-tls\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349161 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r77h\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-kube-api-access-6r77h\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-bound-sa-token\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349367 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e21c4f2-6755-4cfe-8855-3071194e8389-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349400 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e21c4f2-6755-4cfe-8855-3071194e8389-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-certificates\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.349461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-trusted-ca\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.376494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450275 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-bound-sa-token\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e21c4f2-6755-4cfe-8855-3071194e8389-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450385 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e21c4f2-6755-4cfe-8855-3071194e8389-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-certificates\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450476 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-trusted-ca\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-tls\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r77h\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-kube-api-access-6r77h\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.450871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e21c4f2-6755-4cfe-8855-3071194e8389-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.452067 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-certificates\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.452093 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e21c4f2-6755-4cfe-8855-3071194e8389-trusted-ca\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.456879 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e21c4f2-6755-4cfe-8855-3071194e8389-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.457690 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-registry-tls\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.473340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-bound-sa-token\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.481441 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r77h\" (UniqueName: \"kubernetes.io/projected/9e21c4f2-6755-4cfe-8855-3071194e8389-kube-api-access-6r77h\") pod \"image-registry-66df7c8f76-wqj7b\" (UID: \"9e21c4f2-6755-4cfe-8855-3071194e8389\") " pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:06 crc kubenswrapper[4830]: I0318 18:08:06.546529 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:07 crc kubenswrapper[4830]: I0318 18:08:07.046977 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wqj7b"]
Mar 18 18:08:07 crc kubenswrapper[4830]: I0318 18:08:07.858942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b" event={"ID":"9e21c4f2-6755-4cfe-8855-3071194e8389","Type":"ContainerStarted","Data":"c338295cafbda8bfc810638d6a8ddf9d686861b49afb8fcdc49c3df730da157d"}
Mar 18 18:08:07 crc kubenswrapper[4830]: I0318 18:08:07.859011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b" event={"ID":"9e21c4f2-6755-4cfe-8855-3071194e8389","Type":"ContainerStarted","Data":"59c28dfd98ae9aa4d3cc1711289a0e8f77bf8565ddd538c9ee7ffed0f8eaf5db"}
Mar 18 18:08:07 crc kubenswrapper[4830]: I0318 18:08:07.860332 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:07 crc kubenswrapper[4830]: I0318 18:08:07.888538 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b" podStartSLOduration=1.8885034200000002 podStartE2EDuration="1.88850342s" podCreationTimestamp="2026-03-18 18:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:08:07.884454778 +0000 UTC m=+322.452085170" watchObservedRunningTime="2026-03-18 18:08:07.88850342 +0000 UTC m=+322.456133802"
Mar 18 18:08:26 crc kubenswrapper[4830]: I0318 18:08:26.554912 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wqj7b"
Mar 18 18:08:26 crc kubenswrapper[4830]: I0318 18:08:26.661323 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"]
Mar 18 18:08:51 crc kubenswrapper[4830]: I0318 18:08:51.712754 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" podUID="f83d2867-10a5-46ca-9f3c-caedae650499" containerName="registry" containerID="cri-o://dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b" gracePeriod=30
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.161477 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.176934 4830 generic.go:334] "Generic (PLEG): container finished" podID="f83d2867-10a5-46ca-9f3c-caedae650499" containerID="dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b" exitCode=0
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.176998 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" event={"ID":"f83d2867-10a5-46ca-9f3c-caedae650499","Type":"ContainerDied","Data":"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"}
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.177156 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nr285"
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.177178 4830 scope.go:117] "RemoveContainer" containerID="dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.177555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nr285" event={"ID":"f83d2867-10a5-46ca-9f3c-caedae650499","Type":"ContainerDied","Data":"bd444a56f83b7168edd0b851e2c895ba41073af54f370fd23bdfa82a5ba88956"}
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.204743 4830 scope.go:117] "RemoveContainer" containerID="dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"
Mar 18 18:08:52 crc kubenswrapper[4830]: E0318 18:08:52.205297 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b\": container with ID starting with dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b not found: ID does not exist" containerID="dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.205346 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b"} err="failed to get container status \"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b\": rpc error: code = NotFound desc = could not find container \"dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b\": container with ID starting with dd3ad7d0dda9e88427af12f7a7df492711ff983a0f9bf4e1842262d264dc5c3b not found: ID does not exist"
Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294554 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294620 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt6d8\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294907 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294961 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: 
\"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.294997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.295020 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls\") pod \"f83d2867-10a5-46ca-9f3c-caedae650499\" (UID: \"f83d2867-10a5-46ca-9f3c-caedae650499\") " Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.296062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.296250 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.303053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.305230 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8" (OuterVolumeSpecName: "kube-api-access-nt6d8") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "kube-api-access-nt6d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.306466 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.309533 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.312820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.317756 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f83d2867-10a5-46ca-9f3c-caedae650499" (UID: "f83d2867-10a5-46ca-9f3c-caedae650499"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397352 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83d2867-10a5-46ca-9f3c-caedae650499-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397438 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397453 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397464 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397476 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt6d8\" (UniqueName: \"kubernetes.io/projected/f83d2867-10a5-46ca-9f3c-caedae650499-kube-api-access-nt6d8\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397488 4830 reconciler_common.go:293] "Volume 
detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83d2867-10a5-46ca-9f3c-caedae650499-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.397501 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83d2867-10a5-46ca-9f3c-caedae650499-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.514915 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"] Mar 18 18:08:52 crc kubenswrapper[4830]: I0318 18:08:52.520183 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nr285"] Mar 18 18:08:54 crc kubenswrapper[4830]: I0318 18:08:54.246005 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83d2867-10a5-46ca-9f3c-caedae650499" path="/var/lib/kubelet/pods/f83d2867-10a5-46ca-9f3c-caedae650499/volumes" Mar 18 18:09:29 crc kubenswrapper[4830]: I0318 18:09:29.509356 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:09:29 crc kubenswrapper[4830]: I0318 18:09:29.510267 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:09:59 crc kubenswrapper[4830]: I0318 18:09:59.510543 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:09:59 crc kubenswrapper[4830]: I0318 18:09:59.511456 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.150378 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564290-msqq7"] Mar 18 18:10:00 crc kubenswrapper[4830]: E0318 18:10:00.150756 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83d2867-10a5-46ca-9f3c-caedae650499" containerName="registry" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.150816 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83d2867-10a5-46ca-9f3c-caedae650499" containerName="registry" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.151008 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83d2867-10a5-46ca-9f3c-caedae650499" containerName="registry" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.151595 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.154213 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.154754 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.154833 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.160130 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-msqq7"] Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.282615 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh8x\" (UniqueName: \"kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x\") pod \"auto-csr-approver-29564290-msqq7\" (UID: \"319b53d8-aad8-414f-b7d0-204265ab9921\") " pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.384309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxh8x\" (UniqueName: \"kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x\") pod \"auto-csr-approver-29564290-msqq7\" (UID: \"319b53d8-aad8-414f-b7d0-204265ab9921\") " pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.409839 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh8x\" (UniqueName: \"kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x\") pod \"auto-csr-approver-29564290-msqq7\" (UID: \"319b53d8-aad8-414f-b7d0-204265ab9921\") " 
pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.484997 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.803302 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-msqq7"] Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.816479 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:10:00 crc kubenswrapper[4830]: I0318 18:10:00.854800 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-msqq7" event={"ID":"319b53d8-aad8-414f-b7d0-204265ab9921","Type":"ContainerStarted","Data":"d45cd090ad010c9e391544192d79f5d2652d02bedc49840e0cf6eae8be8112f6"} Mar 18 18:10:02 crc kubenswrapper[4830]: I0318 18:10:02.868564 4830 generic.go:334] "Generic (PLEG): container finished" podID="319b53d8-aad8-414f-b7d0-204265ab9921" containerID="de3bb072dd44b76e7b59f4f0e7c9702ffe3cb0ff2228c0454fcbc2eabf8b651d" exitCode=0 Mar 18 18:10:02 crc kubenswrapper[4830]: I0318 18:10:02.868651 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-msqq7" event={"ID":"319b53d8-aad8-414f-b7d0-204265ab9921","Type":"ContainerDied","Data":"de3bb072dd44b76e7b59f4f0e7c9702ffe3cb0ff2228c0454fcbc2eabf8b651d"} Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.120092 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.242144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh8x\" (UniqueName: \"kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x\") pod \"319b53d8-aad8-414f-b7d0-204265ab9921\" (UID: \"319b53d8-aad8-414f-b7d0-204265ab9921\") " Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.252497 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x" (OuterVolumeSpecName: "kube-api-access-pxh8x") pod "319b53d8-aad8-414f-b7d0-204265ab9921" (UID: "319b53d8-aad8-414f-b7d0-204265ab9921"). InnerVolumeSpecName "kube-api-access-pxh8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.344196 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh8x\" (UniqueName: \"kubernetes.io/projected/319b53d8-aad8-414f-b7d0-204265ab9921-kube-api-access-pxh8x\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.884245 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-msqq7" event={"ID":"319b53d8-aad8-414f-b7d0-204265ab9921","Type":"ContainerDied","Data":"d45cd090ad010c9e391544192d79f5d2652d02bedc49840e0cf6eae8be8112f6"} Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.884291 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45cd090ad010c9e391544192d79f5d2652d02bedc49840e0cf6eae8be8112f6" Mar 18 18:10:04 crc kubenswrapper[4830]: I0318 18:10:04.884357 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-msqq7" Mar 18 18:10:29 crc kubenswrapper[4830]: I0318 18:10:29.509426 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:10:29 crc kubenswrapper[4830]: I0318 18:10:29.510316 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:10:29 crc kubenswrapper[4830]: I0318 18:10:29.510400 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:10:29 crc kubenswrapper[4830]: I0318 18:10:29.511344 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:10:29 crc kubenswrapper[4830]: I0318 18:10:29.511470 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e" gracePeriod=600 Mar 18 18:10:30 crc kubenswrapper[4830]: I0318 18:10:30.081737 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e" exitCode=0 Mar 18 18:10:30 crc kubenswrapper[4830]: I0318 18:10:30.081821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e"} Mar 18 18:10:30 crc kubenswrapper[4830]: I0318 18:10:30.082270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc"} Mar 18 18:10:30 crc kubenswrapper[4830]: I0318 18:10:30.082323 4830 scope.go:117] "RemoveContainer" containerID="1de4e26f6767da64f02f9792da506b0d4a20c0e15b76e432cd3ee81dff89156a" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.146533 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564292-cw525"] Mar 18 18:12:00 crc kubenswrapper[4830]: E0318 18:12:00.147652 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319b53d8-aad8-414f-b7d0-204265ab9921" containerName="oc" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.147673 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="319b53d8-aad8-414f-b7d0-204265ab9921" containerName="oc" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.147877 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="319b53d8-aad8-414f-b7d0-204265ab9921" containerName="oc" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.148402 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.151058 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-cw525"] Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.151375 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.151807 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.151864 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.214374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zzks\" (UniqueName: \"kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks\") pod \"auto-csr-approver-29564292-cw525\" (UID: \"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0\") " pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.315492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zzks\" (UniqueName: \"kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks\") pod \"auto-csr-approver-29564292-cw525\" (UID: \"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0\") " pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.337524 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zzks\" (UniqueName: \"kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks\") pod \"auto-csr-approver-29564292-cw525\" (UID: \"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0\") " 
pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.513087 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.754860 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-cw525"] Mar 18 18:12:00 crc kubenswrapper[4830]: I0318 18:12:00.956395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-cw525" event={"ID":"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0","Type":"ContainerStarted","Data":"e701b71da18355199a4fe48d02c2930f0fe4b24f6c5193548ae1c3c8e5acc851"} Mar 18 18:12:02 crc kubenswrapper[4830]: I0318 18:12:02.972269 4830 generic.go:334] "Generic (PLEG): container finished" podID="cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" containerID="7e77e9b2ec82769db03ddeef037c29764253859bdb47ad46ff58506f55cd808a" exitCode=0 Mar 18 18:12:02 crc kubenswrapper[4830]: I0318 18:12:02.972365 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-cw525" event={"ID":"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0","Type":"ContainerDied","Data":"7e77e9b2ec82769db03ddeef037c29764253859bdb47ad46ff58506f55cd808a"} Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.235661 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.369357 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zzks\" (UniqueName: \"kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks\") pod \"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0\" (UID: \"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0\") " Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.381936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks" (OuterVolumeSpecName: "kube-api-access-9zzks") pod "cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" (UID: "cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0"). InnerVolumeSpecName "kube-api-access-9zzks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.471991 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zzks\" (UniqueName: \"kubernetes.io/projected/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0-kube-api-access-9zzks\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.986108 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-cw525" event={"ID":"cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0","Type":"ContainerDied","Data":"e701b71da18355199a4fe48d02c2930f0fe4b24f6c5193548ae1c3c8e5acc851"} Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.986146 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e701b71da18355199a4fe48d02c2930f0fe4b24f6c5193548ae1c3c8e5acc851" Mar 18 18:12:04 crc kubenswrapper[4830]: I0318 18:12:04.986173 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-cw525" Mar 18 18:12:05 crc kubenswrapper[4830]: I0318 18:12:05.306736 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-l7vpb"] Mar 18 18:12:05 crc kubenswrapper[4830]: I0318 18:12:05.310891 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-l7vpb"] Mar 18 18:12:06 crc kubenswrapper[4830]: I0318 18:12:06.244875 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9199b38a-eef8-4a83-a1b8-0f6fd6faaffd" path="/var/lib/kubelet/pods/9199b38a-eef8-4a83-a1b8-0f6fd6faaffd/volumes" Mar 18 18:12:29 crc kubenswrapper[4830]: I0318 18:12:29.509472 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:12:29 crc kubenswrapper[4830]: I0318 18:12:29.510447 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:12:59 crc kubenswrapper[4830]: I0318 18:12:59.509405 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:12:59 crc kubenswrapper[4830]: I0318 18:12:59.510110 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" 
podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:13:10 crc kubenswrapper[4830]: I0318 18:13:10.319086 4830 scope.go:117] "RemoveContainer" containerID="fd9f357484bb5699efad9a8d926a5b96868a1510b7a97d470ebc5dbd41c4c863"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.370169 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vjt8t"]
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371480 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-controller" containerID="cri-o://81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371577 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371644 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-acl-logging" containerID="cri-o://c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371637 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-node" containerID="cri-o://7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371733 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="sbdb" containerID="cri-o://786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371763 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="northd" containerID="cri-o://9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.371862 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="nbdb" containerID="cri-o://3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.453547 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovnkube-controller" containerID="cri-o://c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" gracePeriod=30
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.547904 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-acl-logging/0.log"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.548577 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-controller/0.log"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549244 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" exitCode=0
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549329 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" exitCode=0
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549340 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" exitCode=143
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549351 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" exitCode=143
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549302 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"}
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549420 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"}
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549432 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"}
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.549442 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"}
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.551406 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpw8m_55b8eced-700a-4b44-8315-c5afac8ca1bf/kube-multus/0.log"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.551467 4830 generic.go:334] "Generic (PLEG): container finished" podID="55b8eced-700a-4b44-8315-c5afac8ca1bf" containerID="b2527278040093822f87c66655799a34ece81575e3a39c64302b99c1b2945142" exitCode=2
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.551503 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpw8m" event={"ID":"55b8eced-700a-4b44-8315-c5afac8ca1bf","Type":"ContainerDied","Data":"b2527278040093822f87c66655799a34ece81575e3a39c64302b99c1b2945142"}
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.552041 4830 scope.go:117] "RemoveContainer" containerID="b2527278040093822f87c66655799a34ece81575e3a39c64302b99c1b2945142"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.767349 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-acl-logging/0.log"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.768220 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-controller/0.log"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.768620 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785475 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785512 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785548 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785548 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785548 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785578 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785623 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785624 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785681 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785721 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785761 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8r9t\" (UniqueName: \"kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785835 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785842 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785891 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785915 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket" (OuterVolumeSpecName: "log-socket") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785957 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.785970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786008 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786006 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log" (OuterVolumeSpecName: "node-log") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786027 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786055 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash" (OuterVolumeSpecName: "host-slash") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786307 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786350 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786067 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786444 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786521 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786537 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786568 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.786610 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib\") pod \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\" (UID: \"af6abd23-401c-4f5a-a63a-19d7eed4f9ef\") "
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787094 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787174 4830 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787204 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787222 4830 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787240 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787257 4830 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787274 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787292 4830 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-log-socket\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787307 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787322 4830 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787336 4830 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-node-log\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787351 4830 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-slash\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787368 4830 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787384 4830 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787400 4830 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.787749 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.796233 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.796272 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t" (OuterVolumeSpecName: "kube-api-access-s8r9t") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "kube-api-access-s8r9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.812448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "af6abd23-401c-4f5a-a63a-19d7eed4f9ef" (UID: "af6abd23-401c-4f5a-a63a-19d7eed4f9ef"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829431 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2dj28"]
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829795 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-acl-logging"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829818 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-acl-logging"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829830 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829838 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829849 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" containerName="oc"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829856 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" containerName="oc"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829873 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovnkube-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829880 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovnkube-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829891 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-node"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829900 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-node"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829909 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="northd"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829918 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="northd"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829935 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kubecfg-setup"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829942 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kubecfg-setup"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829953 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="sbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829961 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="sbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829969 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="nbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829977 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="nbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: E0318 18:13:23.829986 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.829995 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830102 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-acl-logging"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830113 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" containerName="oc"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830126 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="nbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830136 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="sbdb"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830148 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovn-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830158 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-node"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830169 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830180 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="northd"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.830188 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerName="ovnkube-controller"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.833170 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.888649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovn-node-metrics-cert\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889148 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-ovn\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889307 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6sdb\" (UniqueName: \"kubernetes.io/projected/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-kube-api-access-s6sdb\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889622 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-netns\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.889914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-var-lib-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890042 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-bin\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890279 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-systemd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-node-log\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-netd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890473 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-systemd-units\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-env-overrides\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890575 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-kubelet\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-script-lib\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890628 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-log-socket\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890718 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-config\") pod \"ovnkube-node-2dj28\" (UID:
\"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890791 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-etc-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-slash\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890896 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8r9t\" (UniqueName: \"kubernetes.io/projected/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-kube-api-access-s8r9t\") on node \"crc\" DevicePath \"\"" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890919 4830 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890935 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890952 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 
18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890968 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.890986 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af6abd23-401c-4f5a-a63a-19d7eed4f9ef-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.992563 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-node-log\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.992870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-netd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.992950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-netd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.992973 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-systemd-units\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.992726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-node-log\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993060 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-env-overrides\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993087 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-script-lib\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-kubelet\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-log-socket\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 
18:13:23.993170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-config\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-slash\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-etc-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993237 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovn-node-metrics-cert\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993221 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-log-socket\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993246 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-kubelet\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993280 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-slash\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-ovn\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993298 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-ovn\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993348 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993371 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6sdb\" (UniqueName: \"kubernetes.io/projected/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-kube-api-access-s6sdb\") pod 
\"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993397 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-netns\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993460 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-var-lib-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993527 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-bin\") pod 
\"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-systemd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993678 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-systemd\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993681 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-env-overrides\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993710 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-netns\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993737 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993801 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-cni-bin\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993807 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993820 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-etc-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.993835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-var-lib-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.994183 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-run-openvswitch\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: 
I0318 18:13:23.994341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-config\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.995048 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovnkube-script-lib\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.995214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-systemd-units\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:23 crc kubenswrapper[4830]: I0318 18:13:23.998152 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-ovn-node-metrics-cert\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.011723 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6sdb\" (UniqueName: \"kubernetes.io/projected/a716ef1c-0b4e-4de5-a3b9-87af15338fbc-kube-api-access-s6sdb\") pod \"ovnkube-node-2dj28\" (UID: \"a716ef1c-0b4e-4de5-a3b9-87af15338fbc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.150339 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" Mar 18 18:13:24 crc kubenswrapper[4830]: W0318 18:13:24.177964 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda716ef1c_0b4e_4de5_a3b9_87af15338fbc.slice/crio-464ee0b431b4d4f117c949387caf28227657af0fc4552fdcebd60d5580dc01a5 WatchSource:0}: Error finding container 464ee0b431b4d4f117c949387caf28227657af0fc4552fdcebd60d5580dc01a5: Status 404 returned error can't find the container with id 464ee0b431b4d4f117c949387caf28227657af0fc4552fdcebd60d5580dc01a5 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.558363 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-acl-logging/0.log" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.559477 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjt8t_af6abd23-401c-4f5a-a63a-19d7eed4f9ef/ovn-controller/0.log" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560065 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" exitCode=0 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560090 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" exitCode=0 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560099 4830 generic.go:334] "Generic (PLEG): container finished" podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" exitCode=0 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560107 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" exitCode=0 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560170 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjt8t" event={"ID":"af6abd23-401c-4f5a-a63a-19d7eed4f9ef","Type":"ContainerDied","Data":"a7425dfb8b27990e707f48978b8a44c389acf9d1920a77ca6381f874ef3bdd3f"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.560241 4830 scope.go:117] 
"RemoveContainer" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.563506 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpw8m_55b8eced-700a-4b44-8315-c5afac8ca1bf/kube-multus/0.log" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.563585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpw8m" event={"ID":"55b8eced-700a-4b44-8315-c5afac8ca1bf","Type":"ContainerStarted","Data":"8430930daea7150ead596ef27f93b9719766fe24716abb74327dfc59b42d82fe"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.566889 4830 generic.go:334] "Generic (PLEG): container finished" podID="a716ef1c-0b4e-4de5-a3b9-87af15338fbc" containerID="c212918482edaea351b211d26ecf5347dedf9d0070922661b7ea9c8b2644e8e3" exitCode=0 Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.566941 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerDied","Data":"c212918482edaea351b211d26ecf5347dedf9d0070922661b7ea9c8b2644e8e3"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.566974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"464ee0b431b4d4f117c949387caf28227657af0fc4552fdcebd60d5580dc01a5"} Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.587611 4830 scope.go:117] "RemoveContainer" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.589915 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vjt8t"] Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.597186 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-vjt8t"] Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.606582 4830 scope.go:117] "RemoveContainer" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.618424 4830 scope.go:117] "RemoveContainer" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.652700 4830 scope.go:117] "RemoveContainer" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.673662 4830 scope.go:117] "RemoveContainer" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.698149 4830 scope.go:117] "RemoveContainer" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.711721 4830 scope.go:117] "RemoveContainer" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.732863 4830 scope.go:117] "RemoveContainer" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.756717 4830 scope.go:117] "RemoveContainer" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.757093 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": container with ID starting with c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482 not found: ID does not exist" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 
18:13:24.757151 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} err="failed to get container status \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": rpc error: code = NotFound desc = could not find container \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": container with ID starting with c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.757196 4830 scope.go:117] "RemoveContainer" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.757531 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": container with ID starting with 786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e not found: ID does not exist" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.757564 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} err="failed to get container status \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": rpc error: code = NotFound desc = could not find container \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": container with ID starting with 786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.757580 4830 scope.go:117] "RemoveContainer" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc 
kubenswrapper[4830]: E0318 18:13:24.758008 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": container with ID starting with 3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195 not found: ID does not exist" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.758071 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} err="failed to get container status \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": rpc error: code = NotFound desc = could not find container \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": container with ID starting with 3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.758098 4830 scope.go:117] "RemoveContainer" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.758557 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": container with ID starting with 9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524 not found: ID does not exist" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.758584 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} err="failed to get container status 
\"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": rpc error: code = NotFound desc = could not find container \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": container with ID starting with 9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.758600 4830 scope.go:117] "RemoveContainer" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.760165 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": container with ID starting with 49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14 not found: ID does not exist" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.760188 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"} err="failed to get container status \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": rpc error: code = NotFound desc = could not find container \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": container with ID starting with 49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.760202 4830 scope.go:117] "RemoveContainer" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.760414 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": container with ID starting with 7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891 not found: ID does not exist" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.760429 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"} err="failed to get container status \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": rpc error: code = NotFound desc = could not find container \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": container with ID starting with 7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.760440 4830 scope.go:117] "RemoveContainer" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.764114 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": container with ID starting with c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9 not found: ID does not exist" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.764161 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"} err="failed to get container status \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": rpc error: code = NotFound desc = could not find container \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": container with ID 
starting with c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.764189 4830 scope.go:117] "RemoveContainer" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.765551 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": container with ID starting with 81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a not found: ID does not exist" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.765596 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"} err="failed to get container status \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": rpc error: code = NotFound desc = could not find container \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": container with ID starting with 81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.765615 4830 scope.go:117] "RemoveContainer" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 18:13:24 crc kubenswrapper[4830]: E0318 18:13:24.767201 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": container with ID starting with 25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc not found: ID does not exist" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 
18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.767265 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc"} err="failed to get container status \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": rpc error: code = NotFound desc = could not find container \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": container with ID starting with 25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.767303 4830 scope.go:117] "RemoveContainer" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.767699 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} err="failed to get container status \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": rpc error: code = NotFound desc = could not find container \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": container with ID starting with c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.767728 4830 scope.go:117] "RemoveContainer" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768067 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} err="failed to get container status \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": rpc error: code = NotFound desc = could not find container 
\"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": container with ID starting with 786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768134 4830 scope.go:117] "RemoveContainer" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768404 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} err="failed to get container status \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": rpc error: code = NotFound desc = could not find container \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": container with ID starting with 3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768447 4830 scope.go:117] "RemoveContainer" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768849 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} err="failed to get container status \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": rpc error: code = NotFound desc = could not find container \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": container with ID starting with 9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.768921 4830 scope.go:117] "RemoveContainer" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.769241 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"} err="failed to get container status \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": rpc error: code = NotFound desc = could not find container \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": container with ID starting with 49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.769287 4830 scope.go:117] "RemoveContainer" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.769759 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"} err="failed to get container status \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": rpc error: code = NotFound desc = could not find container \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": container with ID starting with 7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.769836 4830 scope.go:117] "RemoveContainer" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.770256 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"} err="failed to get container status \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": rpc error: code = NotFound desc = could not find container \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": container with ID starting with 
c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.770279 4830 scope.go:117] "RemoveContainer" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.770723 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"} err="failed to get container status \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": rpc error: code = NotFound desc = could not find container \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": container with ID starting with 81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.770759 4830 scope.go:117] "RemoveContainer" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.772646 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc"} err="failed to get container status \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": rpc error: code = NotFound desc = could not find container \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": container with ID starting with 25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.772668 4830 scope.go:117] "RemoveContainer" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.773215 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} err="failed to get container status \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": rpc error: code = NotFound desc = could not find container \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": container with ID starting with c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.773235 4830 scope.go:117] "RemoveContainer" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.773628 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} err="failed to get container status \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": rpc error: code = NotFound desc = could not find container \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": container with ID starting with 786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.773645 4830 scope.go:117] "RemoveContainer" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.774063 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} err="failed to get container status \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": rpc error: code = NotFound desc = could not find container \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": container with ID starting with 3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195 not found: ID does not 
exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.774265 4830 scope.go:117] "RemoveContainer" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.774846 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} err="failed to get container status \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": rpc error: code = NotFound desc = could not find container \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": container with ID starting with 9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.774885 4830 scope.go:117] "RemoveContainer" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.776009 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"} err="failed to get container status \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": rpc error: code = NotFound desc = could not find container \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": container with ID starting with 49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.776041 4830 scope.go:117] "RemoveContainer" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.776506 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"} err="failed to get container status 
\"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": rpc error: code = NotFound desc = could not find container \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": container with ID starting with 7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.776556 4830 scope.go:117] "RemoveContainer" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.777003 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"} err="failed to get container status \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": rpc error: code = NotFound desc = could not find container \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": container with ID starting with c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.777024 4830 scope.go:117] "RemoveContainer" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.777529 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"} err="failed to get container status \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": rpc error: code = NotFound desc = could not find container \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": container with ID starting with 81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.777546 4830 scope.go:117] "RemoveContainer" 
containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.778004 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc"} err="failed to get container status \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": rpc error: code = NotFound desc = could not find container \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": container with ID starting with 25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.778024 4830 scope.go:117] "RemoveContainer" containerID="c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.778321 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482"} err="failed to get container status \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": rpc error: code = NotFound desc = could not find container \"c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482\": container with ID starting with c62190ab09749ef2cbc13187502af58d6172a5156803e23363a6f5bf35f7d482 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.778333 4830 scope.go:117] "RemoveContainer" containerID="786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.779109 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e"} err="failed to get container status \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": rpc error: code = NotFound desc = could 
not find container \"786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e\": container with ID starting with 786c218fae0f716a035dc2a2c4e3799634bd5378dbf735aaceadfbab8bb1514e not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.779155 4830 scope.go:117] "RemoveContainer" containerID="3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.779446 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195"} err="failed to get container status \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": rpc error: code = NotFound desc = could not find container \"3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195\": container with ID starting with 3ed91f96f623365d7facd62210c861526be22ad64521a84c54d9a3aabf209195 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.779504 4830 scope.go:117] "RemoveContainer" containerID="9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.779985 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524"} err="failed to get container status \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": rpc error: code = NotFound desc = could not find container \"9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524\": container with ID starting with 9c055074fab1f5b68af5ed7cc89c931f51bee0cb3e31c0853b5929a15a28e524 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.780033 4830 scope.go:117] "RemoveContainer" containerID="49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 
18:13:24.780317 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14"} err="failed to get container status \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": rpc error: code = NotFound desc = could not find container \"49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14\": container with ID starting with 49631a35ca0e3e3c551bf4a7bbf459082852d2954932d8e8212368032ae2cc14 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.780345 4830 scope.go:117] "RemoveContainer" containerID="7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.780659 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891"} err="failed to get container status \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": rpc error: code = NotFound desc = could not find container \"7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891\": container with ID starting with 7b85b051550a6614e24d0456dc036cbe8af92feb00795c5b5d8524ce25df2891 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.780687 4830 scope.go:117] "RemoveContainer" containerID="c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.780986 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9"} err="failed to get container status \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": rpc error: code = NotFound desc = could not find container \"c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9\": container with ID starting with 
c629b99c9a5b1f6a1e086a9549a39f86a1f029131265428efa37c24d248aa0d9 not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.781004 4830 scope.go:117] "RemoveContainer" containerID="81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.781285 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a"} err="failed to get container status \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": rpc error: code = NotFound desc = could not find container \"81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a\": container with ID starting with 81c3b57759b7e66d561bfe18dcaaf96eac2559368ca5365ef503c9a51c40257a not found: ID does not exist" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.781355 4830 scope.go:117] "RemoveContainer" containerID="25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc" Mar 18 18:13:24 crc kubenswrapper[4830]: I0318 18:13:24.781658 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc"} err="failed to get container status \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": rpc error: code = NotFound desc = could not find container \"25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc\": container with ID starting with 25bfc51a1d89fadf1e82f3a2d35c1dcbf73f13fcfa73185e9b4db03fb61fdfbc not found: ID does not exist" Mar 18 18:13:25 crc kubenswrapper[4830]: I0318 18:13:25.578176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"0736a2c7343f2a0fd6087aee32dfb1fcf1d73af6f5ccc1a7f62ac3df59f85f73"} Mar 18 18:13:25 crc kubenswrapper[4830]: 
I0318 18:13:25.578926 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"d881ffe8622f14b46ca3bdcf879df74d13880f46a6d9978a1203dcaa632e41db"} Mar 18 18:13:25 crc kubenswrapper[4830]: I0318 18:13:25.578943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"563645c2ed99f1f4c8c670b9d0de3b83878774b4779ad897f16057f7aea89305"} Mar 18 18:13:25 crc kubenswrapper[4830]: I0318 18:13:25.578954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"b950420ef976ce27413909d7c98039a7d721100ab6e939eca0fae596f0a030dd"} Mar 18 18:13:25 crc kubenswrapper[4830]: I0318 18:13:25.578965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"3e6601bc0ff49427f312626f349d0f2add8403e0c543d2f39cebef8e58c6a238"} Mar 18 18:13:25 crc kubenswrapper[4830]: I0318 18:13:25.578977 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"99ac273d9d8b78722f6d89a7baec024b57fa7ccea83817e22be0a7f0fa2f214d"} Mar 18 18:13:26 crc kubenswrapper[4830]: I0318 18:13:26.249250 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6abd23-401c-4f5a-a63a-19d7eed4f9ef" path="/var/lib/kubelet/pods/af6abd23-401c-4f5a-a63a-19d7eed4f9ef/volumes" Mar 18 18:13:27 crc kubenswrapper[4830]: I0318 18:13:27.602228 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" 
event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"976baf2fed7a347a8d413fb6976f12bb783fb27896d44e22815ad0b349bbbf62"}
Mar 18 18:13:29 crc kubenswrapper[4830]: I0318 18:13:29.509698 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:13:29 crc kubenswrapper[4830]: I0318 18:13:29.510190 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:13:29 crc kubenswrapper[4830]: I0318 18:13:29.510260 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:13:29 crc kubenswrapper[4830]: I0318 18:13:29.511155 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:13:29 crc kubenswrapper[4830]: I0318 18:13:29.511259 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc" gracePeriod=600
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.629866 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" event={"ID":"a716ef1c-0b4e-4de5-a3b9-87af15338fbc","Type":"ContainerStarted","Data":"2009f32ec375db8eb0691523373c3f9932b5365313acf69d0c5dc33e066342be"}
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.632192 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.632264 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.632352 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.672690 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.680332 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:30 crc kubenswrapper[4830]: I0318 18:13:30.717476 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28" podStartSLOduration=7.717451942 podStartE2EDuration="7.717451942s" podCreationTimestamp="2026-03-18 18:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:13:30.690731074 +0000 UTC m=+645.258361446" watchObservedRunningTime="2026-03-18 18:13:30.717451942 +0000 UTC m=+645.285082274"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.259075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rld4k"]
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.260417 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.263186 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.263464 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.265234 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.270671 4830 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dvfpb"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.381954 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.382002 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.382041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qng\" (UniqueName: \"kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.398444 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rld4k"]
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.483693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.484107 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.484008 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.484973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.485019 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qng\" (UniqueName: \"kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.517867 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qng\" (UniqueName: \"kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng\") pod \"crc-storage-crc-rld4k\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") " pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.583735 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.621509 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(aaef035c140e39f2071704ad7e9b868f0a842e9944d61390c75d6f4194dde35c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.621619 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(aaef035c140e39f2071704ad7e9b868f0a842e9944d61390c75d6f4194dde35c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.621662 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(aaef035c140e39f2071704ad7e9b868f0a842e9944d61390c75d6f4194dde35c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.621747 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-rld4k_crc-storage(42e2d0f3-46a9-4395-abf0-cfba0ce8803e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-rld4k_crc-storage(42e2d0f3-46a9-4395-abf0-cfba0ce8803e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(aaef035c140e39f2071704ad7e9b868f0a842e9944d61390c75d6f4194dde35c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-rld4k" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.640516 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc" exitCode=0
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.640610 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc"}
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.640675 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.640705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49"}
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.640739 4830 scope.go:117] "RemoveContainer" containerID="42dc38d8df972677fdf94614f323382629f2127ebd6ae0c69812cf7b8f842f9e"
Mar 18 18:13:31 crc kubenswrapper[4830]: I0318 18:13:31.641406 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.687701 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(29665ffb7fa773b68020dba64bbb8d651d431af5da26935ebe05a7db97b523fb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.687758 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(29665ffb7fa773b68020dba64bbb8d651d431af5da26935ebe05a7db97b523fb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.687803 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(29665ffb7fa773b68020dba64bbb8d651d431af5da26935ebe05a7db97b523fb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:31 crc kubenswrapper[4830]: E0318 18:13:31.687860 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-rld4k_crc-storage(42e2d0f3-46a9-4395-abf0-cfba0ce8803e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-rld4k_crc-storage(42e2d0f3-46a9-4395-abf0-cfba0ce8803e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rld4k_crc-storage_42e2d0f3-46a9-4395-abf0-cfba0ce8803e_0(29665ffb7fa773b68020dba64bbb8d651d431af5da26935ebe05a7db97b523fb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-rld4k" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e"
Mar 18 18:13:43 crc kubenswrapper[4830]: I0318 18:13:43.233985 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:43 crc kubenswrapper[4830]: I0318 18:13:43.235597 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:43 crc kubenswrapper[4830]: I0318 18:13:43.503945 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rld4k"]
Mar 18 18:13:43 crc kubenswrapper[4830]: W0318 18:13:43.518993 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e2d0f3_46a9_4395_abf0_cfba0ce8803e.slice/crio-18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1 WatchSource:0}: Error finding container 18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1: Status 404 returned error can't find the container with id 18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1
Mar 18 18:13:43 crc kubenswrapper[4830]: I0318 18:13:43.726906 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rld4k" event={"ID":"42e2d0f3-46a9-4395-abf0-cfba0ce8803e","Type":"ContainerStarted","Data":"18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1"}
Mar 18 18:13:45 crc kubenswrapper[4830]: I0318 18:13:45.758818 4830 generic.go:334] "Generic (PLEG): container finished" podID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e" containerID="c79a38fa06ab08731da9f5b76c8877107b5bd2828136bc7e5f348e91feb6593f" exitCode=0
Mar 18 18:13:45 crc kubenswrapper[4830]: I0318 18:13:45.758916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rld4k" event={"ID":"42e2d0f3-46a9-4395-abf0-cfba0ce8803e","Type":"ContainerDied","Data":"c79a38fa06ab08731da9f5b76c8877107b5bd2828136bc7e5f348e91feb6593f"}
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.119351 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.310266 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qng\" (UniqueName: \"kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng\") pod \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") "
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.310410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage\") pod \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") "
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.310472 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt\") pod \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\" (UID: \"42e2d0f3-46a9-4395-abf0-cfba0ce8803e\") "
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.310825 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "42e2d0f3-46a9-4395-abf0-cfba0ce8803e" (UID: "42e2d0f3-46a9-4395-abf0-cfba0ce8803e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.329230 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng" (OuterVolumeSpecName: "kube-api-access-44qng") pod "42e2d0f3-46a9-4395-abf0-cfba0ce8803e" (UID: "42e2d0f3-46a9-4395-abf0-cfba0ce8803e"). InnerVolumeSpecName "kube-api-access-44qng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.333575 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "42e2d0f3-46a9-4395-abf0-cfba0ce8803e" (UID: "42e2d0f3-46a9-4395-abf0-cfba0ce8803e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.412542 4830 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.412667 4830 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.412740 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qng\" (UniqueName: \"kubernetes.io/projected/42e2d0f3-46a9-4395-abf0-cfba0ce8803e-kube-api-access-44qng\") on node \"crc\" DevicePath \"\""
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.777839 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rld4k" event={"ID":"42e2d0f3-46a9-4395-abf0-cfba0ce8803e","Type":"ContainerDied","Data":"18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1"}
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.777893 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rld4k"
Mar 18 18:13:47 crc kubenswrapper[4830]: I0318 18:13:47.777914 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f783e0e34f3f8c9f85d21fc67e8cf174beb4db465417042f04447629157fd1"
Mar 18 18:13:54 crc kubenswrapper[4830]: I0318 18:13:54.177356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2dj28"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.362751 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"]
Mar 18 18:13:55 crc kubenswrapper[4830]: E0318 18:13:55.363443 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e" containerName="storage"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.363461 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e" containerName="storage"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.363587 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e" containerName="storage"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.364463 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.368353 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.377603 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"]
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.457555 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.457626 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85bn\" (UniqueName: \"kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.457651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.558677 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.558793 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85bn\" (UniqueName: \"kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.558830 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.559417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.559473 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.585184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85bn\" (UniqueName: \"kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.682155 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:13:55 crc kubenswrapper[4830]: I0318 18:13:55.950821 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"]
Mar 18 18:13:56 crc kubenswrapper[4830]: I0318 18:13:56.834243 4830 generic.go:334] "Generic (PLEG): container finished" podID="6700b709-fb58-41aa-a8e9-aad61b389860" containerID="4a2044e890184e21cbbdc7a09840d753dc9e73dc6b29bf91d017f73212b9d96d" exitCode=0
Mar 18 18:13:56 crc kubenswrapper[4830]: I0318 18:13:56.834312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" event={"ID":"6700b709-fb58-41aa-a8e9-aad61b389860","Type":"ContainerDied","Data":"4a2044e890184e21cbbdc7a09840d753dc9e73dc6b29bf91d017f73212b9d96d"}
Mar 18 18:13:56 crc kubenswrapper[4830]: I0318 18:13:56.834687 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" event={"ID":"6700b709-fb58-41aa-a8e9-aad61b389860","Type":"ContainerStarted","Data":"2761402fdc556e52763181a9466f3ae49459a9066589f8d8548cac8d5cbfbb2a"}
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.149268 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564294-mrcl8"]
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.150472 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.153541 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.154060 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.154307 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.155418 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-mrcl8"]
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.226758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjbs\" (UniqueName: \"kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs\") pod \"auto-csr-approver-29564294-mrcl8\" (UID: \"368156cb-80ff-479d-8417-2ff46f33363f\") " pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.328487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjbs\" (UniqueName: \"kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs\") pod \"auto-csr-approver-29564294-mrcl8\" (UID: \"368156cb-80ff-479d-8417-2ff46f33363f\") " pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.355469 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjbs\" (UniqueName: \"kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs\") pod \"auto-csr-approver-29564294-mrcl8\" (UID: \"368156cb-80ff-479d-8417-2ff46f33363f\") " pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.511562 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.696856 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-mrcl8"]
Mar 18 18:14:00 crc kubenswrapper[4830]: I0318 18:14:00.864657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-mrcl8" event={"ID":"368156cb-80ff-479d-8417-2ff46f33363f","Type":"ContainerStarted","Data":"069ed9704dbef42c4f6969a78aa6c9551db0b1a825a9097795bd9e1c769626b0"}
Mar 18 18:14:02 crc kubenswrapper[4830]: I0318 18:14:02.885063 4830 generic.go:334] "Generic (PLEG): container finished" podID="368156cb-80ff-479d-8417-2ff46f33363f" containerID="1c1b4c219495822da312818182ba9a7042f0be47364fe23f93d8553abdd3d518" exitCode=0
Mar 18 18:14:02 crc kubenswrapper[4830]: I0318 18:14:02.885156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-mrcl8" event={"ID":"368156cb-80ff-479d-8417-2ff46f33363f","Type":"ContainerDied","Data":"1c1b4c219495822da312818182ba9a7042f0be47364fe23f93d8553abdd3d518"}
Mar 18 18:14:02 crc kubenswrapper[4830]: I0318 18:14:02.893571 4830 generic.go:334] "Generic (PLEG): container finished" podID="6700b709-fb58-41aa-a8e9-aad61b389860" containerID="189006e7425d48e1bb2f7df0ea1621ffc32815a7cce60643bca7e26701e1eb4c" exitCode=0
Mar 18 18:14:02 crc kubenswrapper[4830]: I0318 18:14:02.893640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" event={"ID":"6700b709-fb58-41aa-a8e9-aad61b389860","Type":"ContainerDied","Data":"189006e7425d48e1bb2f7df0ea1621ffc32815a7cce60643bca7e26701e1eb4c"}
Mar 18 18:14:03 crc kubenswrapper[4830]: I0318 18:14:03.908800 4830 generic.go:334] "Generic (PLEG): container finished" podID="6700b709-fb58-41aa-a8e9-aad61b389860" containerID="385d968e3797225e63157aae56e396974abc8d0f4207a790a5b741e7bec8d917" exitCode=0
Mar 18 18:14:03 crc kubenswrapper[4830]: I0318 18:14:03.908874 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" event={"ID":"6700b709-fb58-41aa-a8e9-aad61b389860","Type":"ContainerDied","Data":"385d968e3797225e63157aae56e396974abc8d0f4207a790a5b741e7bec8d917"}
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.245227 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.282295 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpjbs\" (UniqueName: \"kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs\") pod \"368156cb-80ff-479d-8417-2ff46f33363f\" (UID: \"368156cb-80ff-479d-8417-2ff46f33363f\") "
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.295045 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs" (OuterVolumeSpecName: "kube-api-access-jpjbs") pod "368156cb-80ff-479d-8417-2ff46f33363f" (UID: "368156cb-80ff-479d-8417-2ff46f33363f"). InnerVolumeSpecName "kube-api-access-jpjbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.384965 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpjbs\" (UniqueName: \"kubernetes.io/projected/368156cb-80ff-479d-8417-2ff46f33363f-kube-api-access-jpjbs\") on node \"crc\" DevicePath \"\""
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.920467 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-mrcl8"
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.920446 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-mrcl8" event={"ID":"368156cb-80ff-479d-8417-2ff46f33363f","Type":"ContainerDied","Data":"069ed9704dbef42c4f6969a78aa6c9551db0b1a825a9097795bd9e1c769626b0"}
Mar 18 18:14:04 crc kubenswrapper[4830]: I0318 18:14:04.921177 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="069ed9704dbef42c4f6969a78aa6c9551db0b1a825a9097795bd9e1c769626b0"
Mar 18 18:14:05 crc kubenswrapper[4830]: E0318 18:14:05.062810 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod368156cb_80ff_479d_8417_2ff46f33363f.slice/crio-069ed9704dbef42c4f6969a78aa6c9551db0b1a825a9097795bd9e1c769626b0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod368156cb_80ff_479d_8417_2ff46f33363f.slice\": RecentStats: unable to find data in memory cache]"
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.226224 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj"
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.297423 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-n47gr"]
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.298220 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m85bn\" (UniqueName: \"kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn\") pod \"6700b709-fb58-41aa-a8e9-aad61b389860\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") "
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.298332 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util\") pod \"6700b709-fb58-41aa-a8e9-aad61b389860\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") "
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.298453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle\") pod \"6700b709-fb58-41aa-a8e9-aad61b389860\" (UID: \"6700b709-fb58-41aa-a8e9-aad61b389860\") "
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.299643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle" (OuterVolumeSpecName: "bundle") pod "6700b709-fb58-41aa-a8e9-aad61b389860" (UID: "6700b709-fb58-41aa-a8e9-aad61b389860"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.300782 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-n47gr"]
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.302785 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn" (OuterVolumeSpecName: "kube-api-access-m85bn") pod "6700b709-fb58-41aa-a8e9-aad61b389860" (UID: "6700b709-fb58-41aa-a8e9-aad61b389860"). InnerVolumeSpecName "kube-api-access-m85bn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.309103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util" (OuterVolumeSpecName: "util") pod "6700b709-fb58-41aa-a8e9-aad61b389860" (UID: "6700b709-fb58-41aa-a8e9-aad61b389860"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.399980 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.400027 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m85bn\" (UniqueName: \"kubernetes.io/projected/6700b709-fb58-41aa-a8e9-aad61b389860-kube-api-access-m85bn\") on node \"crc\" DevicePath \"\""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.400047 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6700b709-fb58-41aa-a8e9-aad61b389860-util\") on node \"crc\" DevicePath \"\""
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.933271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" event={"ID":"6700b709-fb58-41aa-a8e9-aad61b389860","Type":"ContainerDied","Data":"2761402fdc556e52763181a9466f3ae49459a9066589f8d8548cac8d5cbfbb2a"}
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.933719 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2761402fdc556e52763181a9466f3ae49459a9066589f8d8548cac8d5cbfbb2a"
Mar 18 18:14:05 crc kubenswrapper[4830]: I0318 18:14:05.934148 4830 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj" Mar 18 18:14:06 crc kubenswrapper[4830]: I0318 18:14:06.267755 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f" path="/var/lib/kubelet/pods/1e24015e-7ff6-47e9-8d2b-3f4b7b46af5f/volumes" Mar 18 18:14:10 crc kubenswrapper[4830]: I0318 18:14:10.393956 4830 scope.go:117] "RemoveContainer" containerID="81ed9c688bd7c555b9c89150a01f87551b09da4b2f705a1f3f051db59cce1b69" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.995736 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv"] Mar 18 18:14:11 crc kubenswrapper[4830]: E0318 18:14:11.996415 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="extract" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996433 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="extract" Mar 18 18:14:11 crc kubenswrapper[4830]: E0318 18:14:11.996463 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="util" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996472 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="util" Mar 18 18:14:11 crc kubenswrapper[4830]: E0318 18:14:11.996484 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="pull" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996492 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="pull" Mar 18 18:14:11 crc kubenswrapper[4830]: E0318 18:14:11.996505 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="368156cb-80ff-479d-8417-2ff46f33363f" containerName="oc" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996513 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="368156cb-80ff-479d-8417-2ff46f33363f" containerName="oc" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996620 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="368156cb-80ff-479d-8417-2ff46f33363f" containerName="oc" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.996644 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6700b709-fb58-41aa-a8e9-aad61b389860" containerName="extract" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.997138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" Mar 18 18:14:11 crc kubenswrapper[4830]: I0318 18:14:11.999524 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.000081 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dndrb" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.000178 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.043162 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv"] Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.194932 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5bx\" (UniqueName: \"kubernetes.io/projected/50496be8-7302-44e4-87ac-b976ebb2099e-kube-api-access-8v5bx\") pod \"nmstate-operator-796d4cfff4-9mfrv\" (UID: \"50496be8-7302-44e4-87ac-b976ebb2099e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" Mar 18 18:14:12 crc 
kubenswrapper[4830]: I0318 18:14:12.296113 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5bx\" (UniqueName: \"kubernetes.io/projected/50496be8-7302-44e4-87ac-b976ebb2099e-kube-api-access-8v5bx\") pod \"nmstate-operator-796d4cfff4-9mfrv\" (UID: \"50496be8-7302-44e4-87ac-b976ebb2099e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.320067 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5bx\" (UniqueName: \"kubernetes.io/projected/50496be8-7302-44e4-87ac-b976ebb2099e-kube-api-access-8v5bx\") pod \"nmstate-operator-796d4cfff4-9mfrv\" (UID: \"50496be8-7302-44e4-87ac-b976ebb2099e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.612708 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.883449 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv"] Mar 18 18:14:12 crc kubenswrapper[4830]: I0318 18:14:12.977959 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" event={"ID":"50496be8-7302-44e4-87ac-b976ebb2099e","Type":"ContainerStarted","Data":"411b306877de240e7de707984d3b2f42f8a27ffc434a6a3eefa5747974bef934"} Mar 18 18:14:15 crc kubenswrapper[4830]: I0318 18:14:15.997575 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" event={"ID":"50496be8-7302-44e4-87ac-b976ebb2099e","Type":"ContainerStarted","Data":"d3910947344cb42dc47a5ec891b5ae2e04aeab69f6cf2722ba3527d0fe15c20d"} Mar 18 18:14:16 crc kubenswrapper[4830]: I0318 18:14:16.026755 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-9mfrv" podStartSLOduration=2.247196701 podStartE2EDuration="5.026730538s" podCreationTimestamp="2026-03-18 18:14:11 +0000 UTC" firstStartedPulling="2026-03-18 18:14:12.890760607 +0000 UTC m=+687.458390939" lastFinishedPulling="2026-03-18 18:14:15.670294444 +0000 UTC m=+690.237924776" observedRunningTime="2026-03-18 18:14:16.024169155 +0000 UTC m=+690.591799527" watchObservedRunningTime="2026-03-18 18:14:16.026730538 +0000 UTC m=+690.594360910" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.053680 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.055984 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.058741 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-n9xlv" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.063087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775jj\" (UniqueName: \"kubernetes.io/projected/fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8-kube-api-access-775jj\") pod \"nmstate-metrics-9b8c8685d-lfvdq\" (UID: \"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.067215 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xww5t"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.068302 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.070247 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.083055 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xww5t"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.089723 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.130722 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qdbvx"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.131484 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.163945 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mq8\" (UniqueName: \"kubernetes.io/projected/aca99d47-cac9-4c1d-97da-5ad69260de41-kube-api-access-r6mq8\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164002 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-dbus-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-775jj\" (UniqueName: \"kubernetes.io/projected/fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8-kube-api-access-775jj\") pod \"nmstate-metrics-9b8c8685d-lfvdq\" (UID: \"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164091 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-nmstate-lock\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164116 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-ovs-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.164133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwm9\" (UniqueName: \"kubernetes.io/projected/f480edf1-a3da-4567-bf97-d3067dd88a64-kube-api-access-5jwm9\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.184173 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-775jj\" (UniqueName: \"kubernetes.io/projected/fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8-kube-api-access-775jj\") pod \"nmstate-metrics-9b8c8685d-lfvdq\" (UID: \"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.221866 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.222691 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.224904 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-c79bs" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.225076 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.225130 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.241522 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265245 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-dbus-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98txl\" (UniqueName: 
\"kubernetes.io/projected/0d49e8e6-69ee-4c37-a687-433ad140281d-kube-api-access-98txl\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-nmstate-lock\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265419 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-ovs-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5jwm9\" (UniqueName: \"kubernetes.io/projected/f480edf1-a3da-4567-bf97-d3067dd88a64-kube-api-access-5jwm9\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265458 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mq8\" (UniqueName: \"kubernetes.io/projected/aca99d47-cac9-4c1d-97da-5ad69260de41-kube-api-access-r6mq8\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265473 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d49e8e6-69ee-4c37-a687-433ad140281d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.265862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-dbus-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: E0318 18:14:17.265950 4830 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 18:14:17 crc kubenswrapper[4830]: E0318 18:14:17.265993 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair podName:aca99d47-cac9-4c1d-97da-5ad69260de41 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:14:17.765976355 +0000 UTC m=+692.333606687 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair") pod "nmstate-webhook-5f558f5558-xww5t" (UID: "aca99d47-cac9-4c1d-97da-5ad69260de41") : secret "openshift-nmstate-webhook" not found Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.266555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-nmstate-lock\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.266589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f480edf1-a3da-4567-bf97-d3067dd88a64-ovs-socket\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.285330 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwm9\" (UniqueName: \"kubernetes.io/projected/f480edf1-a3da-4567-bf97-d3067dd88a64-kube-api-access-5jwm9\") pod \"nmstate-handler-qdbvx\" (UID: \"f480edf1-a3da-4567-bf97-d3067dd88a64\") " pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.289549 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mq8\" (UniqueName: \"kubernetes.io/projected/aca99d47-cac9-4c1d-97da-5ad69260de41-kube-api-access-r6mq8\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.367137 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.367443 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d49e8e6-69ee-4c37-a687-433ad140281d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.367551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98txl\" (UniqueName: \"kubernetes.io/projected/0d49e8e6-69ee-4c37-a687-433ad140281d-kube-api-access-98txl\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: E0318 18:14:17.367294 4830 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 18:14:17 crc kubenswrapper[4830]: E0318 18:14:17.368006 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert podName:0d49e8e6-69ee-4c37-a687-433ad140281d nodeName:}" failed. No retries permitted until 2026-03-18 18:14:17.867991124 +0000 UTC m=+692.435621456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-fhhnd" (UID: "0d49e8e6-69ee-4c37-a687-433ad140281d") : secret "plugin-serving-cert" not found Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.368343 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d49e8e6-69ee-4c37-a687-433ad140281d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.386834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98txl\" (UniqueName: \"kubernetes.io/projected/0d49e8e6-69ee-4c37-a687-433ad140281d-kube-api-access-98txl\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.412734 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f867cf8d5-r7xct"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.413392 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.429556 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.444331 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f867cf8d5-r7xct"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.461464 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468610 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-trusted-ca-bundle\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468731 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-oauth-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-service-ca\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468875 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-serving-cert\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-oauth-serving-cert\") pod 
\"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.468967 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwb4j\" (UniqueName: \"kubernetes.io/projected/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-kube-api-access-wwb4j\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.469011 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569811 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-serving-cert\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569859 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-oauth-serving-cert\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569892 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwb4j\" (UniqueName: 
\"kubernetes.io/projected/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-kube-api-access-wwb4j\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569920 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569947 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-trusted-ca-bundle\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.569987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-oauth-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.570004 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-service-ca\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.570920 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-service-ca\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.571580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.572346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-trusted-ca-bundle\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.572372 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-oauth-serving-cert\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.576890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-serving-cert\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.578522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-console-oauth-config\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.589724 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwb4j\" (UniqueName: \"kubernetes.io/projected/3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461-kube-api-access-wwb4j\") pod \"console-f867cf8d5-r7xct\" (UID: \"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461\") " pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.639069 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq"] Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.728324 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.771624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.777411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aca99d47-cac9-4c1d-97da-5ad69260de41-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xww5t\" (UID: \"aca99d47-cac9-4c1d-97da-5ad69260de41\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.872625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.877003 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d49e8e6-69ee-4c37-a687-433ad140281d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fhhnd\" (UID: \"0d49e8e6-69ee-4c37-a687-433ad140281d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:17 crc kubenswrapper[4830]: I0318 18:14:17.948784 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f867cf8d5-r7xct"] Mar 18 18:14:17 crc kubenswrapper[4830]: W0318 18:14:17.954467 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3f2941_3ef1_45c0_b9b5_d7c7a75b1461.slice/crio-db8910d3b4a954a9774064c609e81cb8b93cd0081066951f0b03a6b0f7d77fef WatchSource:0}: Error finding container db8910d3b4a954a9774064c609e81cb8b93cd0081066951f0b03a6b0f7d77fef: Status 404 returned error can't find the container with id db8910d3b4a954a9774064c609e81cb8b93cd0081066951f0b03a6b0f7d77fef Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.017744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qdbvx" event={"ID":"f480edf1-a3da-4567-bf97-d3067dd88a64","Type":"ContainerStarted","Data":"766529e6887dbf40fb176d212f7ff2fab1811cab19784615ef5c930fd04d463f"} Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.020156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" 
event={"ID":"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8","Type":"ContainerStarted","Data":"fc0d5f503913262e970f0ab453e233509575aeaf110f93c78de9465f505a66f2"} Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.021592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f867cf8d5-r7xct" event={"ID":"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461","Type":"ContainerStarted","Data":"db8910d3b4a954a9774064c609e81cb8b93cd0081066951f0b03a6b0f7d77fef"} Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.046154 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.141331 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.255815 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xww5t"] Mar 18 18:14:18 crc kubenswrapper[4830]: I0318 18:14:18.370654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd"] Mar 18 18:14:19 crc kubenswrapper[4830]: I0318 18:14:19.030768 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" event={"ID":"0d49e8e6-69ee-4c37-a687-433ad140281d","Type":"ContainerStarted","Data":"37770c9931ec0e4dfa7851c7381ca1ad347704f08536b4a403c80b6b8b6869e4"} Mar 18 18:14:19 crc kubenswrapper[4830]: I0318 18:14:19.035089 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" event={"ID":"aca99d47-cac9-4c1d-97da-5ad69260de41","Type":"ContainerStarted","Data":"c067c2a9c5b2bfa1efb1ac43b109dc4176d39847e389ff24549dc24bde4e0190"} Mar 18 18:14:19 crc kubenswrapper[4830]: I0318 18:14:19.038407 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/console-f867cf8d5-r7xct" event={"ID":"3a3f2941-3ef1-45c0-b9b5-d7c7a75b1461","Type":"ContainerStarted","Data":"3e213ee52749e5093889244bc15e627a77b71cdcf2a06ba1dffbe46d4953cd62"} Mar 18 18:14:19 crc kubenswrapper[4830]: I0318 18:14:19.083823 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f867cf8d5-r7xct" podStartSLOduration=2.083759178 podStartE2EDuration="2.083759178s" podCreationTimestamp="2026-03-18 18:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:14:19.072449485 +0000 UTC m=+693.640079867" watchObservedRunningTime="2026-03-18 18:14:19.083759178 +0000 UTC m=+693.651389550" Mar 18 18:14:21 crc kubenswrapper[4830]: I0318 18:14:21.061307 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" event={"ID":"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8","Type":"ContainerStarted","Data":"64232162ad23d8e5538b422f007b588bc6f2bf42d0b428a59dc78cd7a3c059e5"} Mar 18 18:14:21 crc kubenswrapper[4830]: I0318 18:14:21.064647 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qdbvx" event={"ID":"f480edf1-a3da-4567-bf97-d3067dd88a64","Type":"ContainerStarted","Data":"f530a5152032d0f891544251c626e262ebfb67d7722c50babda4694270699ba8"} Mar 18 18:14:21 crc kubenswrapper[4830]: I0318 18:14:21.064815 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:21 crc kubenswrapper[4830]: I0318 18:14:21.088168 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qdbvx" podStartSLOduration=1.299414241 podStartE2EDuration="4.088140992s" podCreationTimestamp="2026-03-18 18:14:17 +0000 UTC" firstStartedPulling="2026-03-18 18:14:17.483737593 +0000 UTC m=+692.051367925" 
lastFinishedPulling="2026-03-18 18:14:20.272464334 +0000 UTC m=+694.840094676" observedRunningTime="2026-03-18 18:14:21.084084916 +0000 UTC m=+695.651715248" watchObservedRunningTime="2026-03-18 18:14:21.088140992 +0000 UTC m=+695.655771324" Mar 18 18:14:22 crc kubenswrapper[4830]: I0318 18:14:22.087177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" event={"ID":"aca99d47-cac9-4c1d-97da-5ad69260de41","Type":"ContainerStarted","Data":"b6d808b11971e67cd32111c3e6dab1ba9c02d9b0b6d4dec9ecf88fb62e692d16"} Mar 18 18:14:22 crc kubenswrapper[4830]: I0318 18:14:22.110834 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" podStartSLOduration=2.430612679 podStartE2EDuration="5.110768212s" podCreationTimestamp="2026-03-18 18:14:17 +0000 UTC" firstStartedPulling="2026-03-18 18:14:18.293650248 +0000 UTC m=+692.861280580" lastFinishedPulling="2026-03-18 18:14:20.973805761 +0000 UTC m=+695.541436113" observedRunningTime="2026-03-18 18:14:22.110301478 +0000 UTC m=+696.677931850" watchObservedRunningTime="2026-03-18 18:14:22.110768212 +0000 UTC m=+696.678398584" Mar 18 18:14:23 crc kubenswrapper[4830]: I0318 18:14:23.121927 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" event={"ID":"0d49e8e6-69ee-4c37-a687-433ad140281d","Type":"ContainerStarted","Data":"3f0f85d5dbc804d1a7d13098d1ab0a39d07f9eef49664810e301c5258cc8f9dc"} Mar 18 18:14:23 crc kubenswrapper[4830]: I0318 18:14:23.122454 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:23 crc kubenswrapper[4830]: I0318 18:14:23.152671 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fhhnd" podStartSLOduration=2.217606924 podStartE2EDuration="6.152652781s" 
podCreationTimestamp="2026-03-18 18:14:17 +0000 UTC" firstStartedPulling="2026-03-18 18:14:18.380226376 +0000 UTC m=+692.947856708" lastFinishedPulling="2026-03-18 18:14:22.315272233 +0000 UTC m=+696.882902565" observedRunningTime="2026-03-18 18:14:23.149160441 +0000 UTC m=+697.716790813" watchObservedRunningTime="2026-03-18 18:14:23.152652781 +0000 UTC m=+697.720283123" Mar 18 18:14:24 crc kubenswrapper[4830]: I0318 18:14:24.130455 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" event={"ID":"fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8","Type":"ContainerStarted","Data":"7c47d2b2f47179f172f87dc432278dadc90bc67a449904bb1066942d83b184cd"} Mar 18 18:14:24 crc kubenswrapper[4830]: I0318 18:14:24.152434 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-lfvdq" podStartSLOduration=1.83836786 podStartE2EDuration="7.152419239s" podCreationTimestamp="2026-03-18 18:14:17 +0000 UTC" firstStartedPulling="2026-03-18 18:14:17.647993227 +0000 UTC m=+692.215623559" lastFinishedPulling="2026-03-18 18:14:22.962044596 +0000 UTC m=+697.529674938" observedRunningTime="2026-03-18 18:14:24.151636757 +0000 UTC m=+698.719267129" watchObservedRunningTime="2026-03-18 18:14:24.152419239 +0000 UTC m=+698.720049571" Mar 18 18:14:27 crc kubenswrapper[4830]: I0318 18:14:27.504401 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qdbvx" Mar 18 18:14:27 crc kubenswrapper[4830]: I0318 18:14:27.729160 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:27 crc kubenswrapper[4830]: I0318 18:14:27.729235 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:27 crc kubenswrapper[4830]: I0318 18:14:27.739504 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:28 crc kubenswrapper[4830]: I0318 18:14:28.171839 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f867cf8d5-r7xct" Mar 18 18:14:28 crc kubenswrapper[4830]: I0318 18:14:28.246028 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lfz57"] Mar 18 18:14:38 crc kubenswrapper[4830]: I0318 18:14:38.056853 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xww5t" Mar 18 18:14:41 crc kubenswrapper[4830]: I0318 18:14:41.193976 4830 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.325482 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lfz57" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" containerID="cri-o://2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c" gracePeriod=15 Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.767592 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lfz57_95b4d24e-09da-4c0d-9d24-81621509024a/console/0.log" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.768261 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888291 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888333 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888443 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888471 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888495 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888529 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7p6sn\" (UniqueName: \"kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.888566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca\") pod \"95b4d24e-09da-4c0d-9d24-81621509024a\" (UID: \"95b4d24e-09da-4c0d-9d24-81621509024a\") " Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.889969 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca" (OuterVolumeSpecName: "service-ca") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.889989 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config" (OuterVolumeSpecName: "console-config") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.890008 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.890301 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.901974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.902351 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn" (OuterVolumeSpecName: "kube-api-access-7p6sn") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "kube-api-access-7p6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.902659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "95b4d24e-09da-4c0d-9d24-81621509024a" (UID: "95b4d24e-09da-4c0d-9d24-81621509024a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.991007 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.991480 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.991707 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p6sn\" (UniqueName: \"kubernetes.io/projected/95b4d24e-09da-4c0d-9d24-81621509024a-kube-api-access-7p6sn\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.991872 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.991992 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.992140 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b4d24e-09da-4c0d-9d24-81621509024a-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:53 crc kubenswrapper[4830]: I0318 18:14:53.992258 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b4d24e-09da-4c0d-9d24-81621509024a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:54 crc 
kubenswrapper[4830]: I0318 18:14:54.353589 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lfz57_95b4d24e-09da-4c0d-9d24-81621509024a/console/0.log" Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.353646 4830 generic.go:334] "Generic (PLEG): container finished" podID="95b4d24e-09da-4c0d-9d24-81621509024a" containerID="2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c" exitCode=2 Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.353681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfz57" event={"ID":"95b4d24e-09da-4c0d-9d24-81621509024a","Type":"ContainerDied","Data":"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c"} Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.353707 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfz57" event={"ID":"95b4d24e-09da-4c0d-9d24-81621509024a","Type":"ContainerDied","Data":"052cda2b09aa5376af7220895ab194ca96345372ea8b743b3811ddb0eb9b2045"} Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.353726 4830 scope.go:117] "RemoveContainer" containerID="2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c" Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.353807 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lfz57" Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.378010 4830 scope.go:117] "RemoveContainer" containerID="2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c" Mar 18 18:14:54 crc kubenswrapper[4830]: E0318 18:14:54.378537 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c\": container with ID starting with 2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c not found: ID does not exist" containerID="2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c" Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.378605 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c"} err="failed to get container status \"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c\": rpc error: code = NotFound desc = could not find container \"2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c\": container with ID starting with 2931bf69515a4a9a5d0e070d0c198fb1079578d21a98cc67e91f316c33e5b89c not found: ID does not exist" Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.381771 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lfz57"] Mar 18 18:14:54 crc kubenswrapper[4830]: I0318 18:14:54.386314 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lfz57"] Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.249021 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" path="/var/lib/kubelet/pods/95b4d24e-09da-4c0d-9d24-81621509024a/volumes" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.825018 4830 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c"] Mar 18 18:14:56 crc kubenswrapper[4830]: E0318 18:14:56.825492 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.825535 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.825877 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b4d24e-09da-4c0d-9d24-81621509024a" containerName="console" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.827496 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.834431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.844524 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c"] Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.940468 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.940641 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:56 crc kubenswrapper[4830]: I0318 18:14:56.940679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskkt\" (UniqueName: \"kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.042735 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.042930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.043010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskkt\" (UniqueName: \"kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.043350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.043572 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.074225 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskkt\" (UniqueName: \"kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.194930 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:14:57 crc kubenswrapper[4830]: I0318 18:14:57.454143 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c"] Mar 18 18:14:57 crc kubenswrapper[4830]: W0318 18:14:57.477374 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ced8bbc_00e4_4b23_88dc_809962a78be2.slice/crio-211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e WatchSource:0}: Error finding container 211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e: Status 404 returned error can't find the container with id 211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e Mar 18 18:14:58 crc kubenswrapper[4830]: I0318 18:14:58.408585 4830 generic.go:334] "Generic (PLEG): container finished" podID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerID="977632deb8937c3a4314b6dcd59219635e8ad5f6c27a751984a91387b83a9848" exitCode=0 Mar 18 18:14:58 crc kubenswrapper[4830]: I0318 18:14:58.408936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" event={"ID":"3ced8bbc-00e4-4b23-88dc-809962a78be2","Type":"ContainerDied","Data":"977632deb8937c3a4314b6dcd59219635e8ad5f6c27a751984a91387b83a9848"} Mar 18 18:14:58 crc kubenswrapper[4830]: I0318 18:14:58.409376 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" event={"ID":"3ced8bbc-00e4-4b23-88dc-809962a78be2","Type":"ContainerStarted","Data":"211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e"} Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.156125 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v"] Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.157823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.165402 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.166195 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.176010 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.178219 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.188185 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v"] Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.202398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.300841 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.300920 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w5lvx\" (UniqueName: \"kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.300964 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.300995 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.301022 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmq8\" (UniqueName: \"kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.301960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" 
Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.403985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404052 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lvx\" (UniqueName: \"kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404083 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmq8\" (UniqueName: \"kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.404722 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.405261 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.412203 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.426129 4830 generic.go:334] "Generic (PLEG): container finished" podID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerID="788b23e52974b32474406a4776a33a513f114050830828ef6441678f6c9ab3dd" exitCode=0 Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.426172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" event={"ID":"3ced8bbc-00e4-4b23-88dc-809962a78be2","Type":"ContainerDied","Data":"788b23e52974b32474406a4776a33a513f114050830828ef6441678f6c9ab3dd"} Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.437286 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmq8\" (UniqueName: \"kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8\") pod \"redhat-operators-rbh58\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.525773 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.527745 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lvx\" (UniqueName: \"kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx\") pod \"collect-profiles-29564295-g725v\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.750181 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:00 crc kubenswrapper[4830]: I0318 18:15:00.805352 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.048602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v"] Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.434217 4830 generic.go:334] "Generic (PLEG): container finished" podID="cfb11874-1ec9-48f8-9312-718370bab9d1" containerID="026cd292b47a740a4f80947a447a9aae6282e861d3336af0e7a1d9ce29072b04" exitCode=0 Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.434381 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" event={"ID":"cfb11874-1ec9-48f8-9312-718370bab9d1","Type":"ContainerDied","Data":"026cd292b47a740a4f80947a447a9aae6282e861d3336af0e7a1d9ce29072b04"} Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.434797 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" event={"ID":"cfb11874-1ec9-48f8-9312-718370bab9d1","Type":"ContainerStarted","Data":"c26cce99b54288e72b7b9c90e1a18fc51f4f422f0c0c4d09be02b74dda8e28bd"} Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.437490 4830 generic.go:334] "Generic (PLEG): container finished" podID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerID="6af9917493b00a644721ff312b506ca93ff17a85baa6b729adbab56f40bea493" exitCode=0 Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.437559 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" event={"ID":"3ced8bbc-00e4-4b23-88dc-809962a78be2","Type":"ContainerDied","Data":"6af9917493b00a644721ff312b506ca93ff17a85baa6b729adbab56f40bea493"} Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.439174 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerID="6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e" exitCode=0 Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.439242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerDied","Data":"6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e"} Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.439273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerStarted","Data":"78c03fc65e45342605e97d4988ae70a53023dfea72dda838940ee6eead4c7bdd"} Mar 18 18:15:01 crc kubenswrapper[4830]: I0318 18:15:01.441464 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.727496 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.792713 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.837677 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5lvx\" (UniqueName: \"kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx\") pod \"cfb11874-1ec9-48f8-9312-718370bab9d1\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.837937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume\") pod \"cfb11874-1ec9-48f8-9312-718370bab9d1\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.838136 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume\") pod \"cfb11874-1ec9-48f8-9312-718370bab9d1\" (UID: \"cfb11874-1ec9-48f8-9312-718370bab9d1\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.838957 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfb11874-1ec9-48f8-9312-718370bab9d1" (UID: "cfb11874-1ec9-48f8-9312-718370bab9d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.843060 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx" (OuterVolumeSpecName: "kube-api-access-w5lvx") pod "cfb11874-1ec9-48f8-9312-718370bab9d1" (UID: "cfb11874-1ec9-48f8-9312-718370bab9d1"). 
InnerVolumeSpecName "kube-api-access-w5lvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.843158 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfb11874-1ec9-48f8-9312-718370bab9d1" (UID: "cfb11874-1ec9-48f8-9312-718370bab9d1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939405 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle\") pod \"3ced8bbc-00e4-4b23-88dc-809962a78be2\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939489 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fskkt\" (UniqueName: \"kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt\") pod \"3ced8bbc-00e4-4b23-88dc-809962a78be2\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util\") pod \"3ced8bbc-00e4-4b23-88dc-809962a78be2\" (UID: \"3ced8bbc-00e4-4b23-88dc-809962a78be2\") " Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939873 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5lvx\" (UniqueName: \"kubernetes.io/projected/cfb11874-1ec9-48f8-9312-718370bab9d1-kube-api-access-w5lvx\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939886 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/cfb11874-1ec9-48f8-9312-718370bab9d1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.939896 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfb11874-1ec9-48f8-9312-718370bab9d1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.941715 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle" (OuterVolumeSpecName: "bundle") pod "3ced8bbc-00e4-4b23-88dc-809962a78be2" (UID: "3ced8bbc-00e4-4b23-88dc-809962a78be2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:15:02 crc kubenswrapper[4830]: I0318 18:15:02.944304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt" (OuterVolumeSpecName: "kube-api-access-fskkt") pod "3ced8bbc-00e4-4b23-88dc-809962a78be2" (UID: "3ced8bbc-00e4-4b23-88dc-809962a78be2"). InnerVolumeSpecName "kube-api-access-fskkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.041503 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.041548 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fskkt\" (UniqueName: \"kubernetes.io/projected/3ced8bbc-00e4-4b23-88dc-809962a78be2-kube-api-access-fskkt\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.250227 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util" (OuterVolumeSpecName: "util") pod "3ced8bbc-00e4-4b23-88dc-809962a78be2" (UID: "3ced8bbc-00e4-4b23-88dc-809962a78be2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.347635 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced8bbc-00e4-4b23-88dc-809962a78be2-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.455900 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.455906 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c" event={"ID":"3ced8bbc-00e4-4b23-88dc-809962a78be2","Type":"ContainerDied","Data":"211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e"} Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.456441 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="211fc0b3dcae4367d17da55ecd85fce91afd80590a1f490dcaa258b1feeb7e8e" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.457698 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerStarted","Data":"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b"} Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.459915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" event={"ID":"cfb11874-1ec9-48f8-9312-718370bab9d1","Type":"ContainerDied","Data":"c26cce99b54288e72b7b9c90e1a18fc51f4f422f0c0c4d09be02b74dda8e28bd"} Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.460046 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26cce99b54288e72b7b9c90e1a18fc51f4f422f0c0c4d09be02b74dda8e28bd" Mar 18 18:15:03 crc kubenswrapper[4830]: I0318 18:15:03.460025 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v" Mar 18 18:15:04 crc kubenswrapper[4830]: I0318 18:15:04.477173 4830 generic.go:334] "Generic (PLEG): container finished" podID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerID="f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b" exitCode=0 Mar 18 18:15:04 crc kubenswrapper[4830]: I0318 18:15:04.477234 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerDied","Data":"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b"} Mar 18 18:15:05 crc kubenswrapper[4830]: I0318 18:15:05.488696 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerStarted","Data":"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473"} Mar 18 18:15:05 crc kubenswrapper[4830]: I0318 18:15:05.517845 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rbh58" podStartSLOduration=1.9898724479999998 podStartE2EDuration="5.51782182s" podCreationTimestamp="2026-03-18 18:15:00 +0000 UTC" firstStartedPulling="2026-03-18 18:15:01.441094007 +0000 UTC m=+736.008724359" lastFinishedPulling="2026-03-18 18:15:04.969043369 +0000 UTC m=+739.536673731" observedRunningTime="2026-03-18 18:15:05.513290606 +0000 UTC m=+740.080921018" watchObservedRunningTime="2026-03-18 18:15:05.51782182 +0000 UTC m=+740.085452162" Mar 18 18:15:10 crc kubenswrapper[4830]: I0318 18:15:10.526968 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:10 crc kubenswrapper[4830]: I0318 18:15:10.527394 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:11 crc kubenswrapper[4830]: I0318 18:15:11.618631 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rbh58" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="registry-server" probeResult="failure" output=< Mar 18 18:15:11 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 18:15:11 crc kubenswrapper[4830]: > Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.742739 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg"] Mar 18 18:15:16 crc kubenswrapper[4830]: E0318 18:15:16.743505 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="pull" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743530 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="pull" Mar 18 18:15:16 crc kubenswrapper[4830]: E0318 18:15:16.743552 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb11874-1ec9-48f8-9312-718370bab9d1" containerName="collect-profiles" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743558 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb11874-1ec9-48f8-9312-718370bab9d1" containerName="collect-profiles" Mar 18 18:15:16 crc kubenswrapper[4830]: E0318 18:15:16.743573 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="extract" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743579 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="extract" Mar 18 18:15:16 crc kubenswrapper[4830]: E0318 18:15:16.743602 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" 
containerName="util" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743608 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="util" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743707 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ced8bbc-00e4-4b23-88dc-809962a78be2" containerName="extract" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.743720 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb11874-1ec9-48f8-9312-718370bab9d1" containerName="collect-profiles" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.744175 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.747967 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.748310 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.750059 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.750511 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.750657 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tgbqh" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.766072 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg"] Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.831932 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-webhook-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.831986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-apiservice-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.832010 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r98rq\" (UniqueName: \"kubernetes.io/projected/684ba3af-f549-47d2-81b9-52a1993a93ff-kube-api-access-r98rq\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.933166 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-webhook-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.933211 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-apiservice-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.933230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r98rq\" (UniqueName: \"kubernetes.io/projected/684ba3af-f549-47d2-81b9-52a1993a93ff-kube-api-access-r98rq\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.940450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-webhook-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.942545 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/684ba3af-f549-47d2-81b9-52a1993a93ff-apiservice-cert\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.949554 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r98rq\" (UniqueName: \"kubernetes.io/projected/684ba3af-f549-47d2-81b9-52a1993a93ff-kube-api-access-r98rq\") pod \"metallb-operator-controller-manager-86dfc68bcc-zxmhg\" (UID: \"684ba3af-f549-47d2-81b9-52a1993a93ff\") " 
pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.990533 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb"] Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.991241 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.993726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.994155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 18:15:16 crc kubenswrapper[4830]: I0318 18:15:16.996905 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nqckz" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.014293 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb"] Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.060155 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.135552 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-apiservice-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.136105 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-webhook-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.136144 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxgd8\" (UniqueName: \"kubernetes.io/projected/1ca766c2-0f41-45d7-b219-d5293e66ca65-kube-api-access-kxgd8\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.239480 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-apiservice-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.239535 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-webhook-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.239561 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxgd8\" (UniqueName: \"kubernetes.io/projected/1ca766c2-0f41-45d7-b219-d5293e66ca65-kube-api-access-kxgd8\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.245810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-webhook-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.245855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ca766c2-0f41-45d7-b219-d5293e66ca65-apiservice-cert\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.258913 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxgd8\" (UniqueName: \"kubernetes.io/projected/1ca766c2-0f41-45d7-b219-d5293e66ca65-kube-api-access-kxgd8\") pod \"metallb-operator-webhook-server-85f6f6858-qtgzb\" (UID: \"1ca766c2-0f41-45d7-b219-d5293e66ca65\") " 
pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.308385 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.331092 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg"] Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.576212 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" event={"ID":"684ba3af-f549-47d2-81b9-52a1993a93ff","Type":"ContainerStarted","Data":"d636c214721f1995d162c089b1967705e1bb4f14783d765c61dabe2b249e469f"} Mar 18 18:15:17 crc kubenswrapper[4830]: I0318 18:15:17.817640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb"] Mar 18 18:15:17 crc kubenswrapper[4830]: W0318 18:15:17.823789 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca766c2_0f41_45d7_b219_d5293e66ca65.slice/crio-32bf6a27269c61f22b8e2b5ba45f3dc8fb013367a8298901584f9ec6cf1234dd WatchSource:0}: Error finding container 32bf6a27269c61f22b8e2b5ba45f3dc8fb013367a8298901584f9ec6cf1234dd: Status 404 returned error can't find the container with id 32bf6a27269c61f22b8e2b5ba45f3dc8fb013367a8298901584f9ec6cf1234dd Mar 18 18:15:18 crc kubenswrapper[4830]: I0318 18:15:18.582840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" event={"ID":"1ca766c2-0f41-45d7-b219-d5293e66ca65","Type":"ContainerStarted","Data":"32bf6a27269c61f22b8e2b5ba45f3dc8fb013367a8298901584f9ec6cf1234dd"} Mar 18 18:15:20 crc kubenswrapper[4830]: I0318 18:15:20.591080 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:20 crc kubenswrapper[4830]: I0318 18:15:20.647049 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:20 crc kubenswrapper[4830]: I0318 18:15:20.833377 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:22 crc kubenswrapper[4830]: I0318 18:15:22.633485 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" event={"ID":"684ba3af-f549-47d2-81b9-52a1993a93ff","Type":"ContainerStarted","Data":"276cbd03ab2cd2929b51f49545d980627418d87ebcf376ee4b0847095b3d095e"} Mar 18 18:15:22 crc kubenswrapper[4830]: I0318 18:15:22.634540 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:22 crc kubenswrapper[4830]: I0318 18:15:22.634868 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rbh58" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="registry-server" containerID="cri-o://a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473" gracePeriod=2 Mar 18 18:15:22 crc kubenswrapper[4830]: I0318 18:15:22.668200 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" podStartSLOduration=2.241174971 podStartE2EDuration="6.668171191s" podCreationTimestamp="2026-03-18 18:15:16 +0000 UTC" firstStartedPulling="2026-03-18 18:15:17.344070629 +0000 UTC m=+751.911700961" lastFinishedPulling="2026-03-18 18:15:21.771066849 +0000 UTC m=+756.338697181" observedRunningTime="2026-03-18 18:15:22.659164844 +0000 UTC m=+757.226795176" watchObservedRunningTime="2026-03-18 18:15:22.668171191 +0000 UTC 
m=+757.235801563" Mar 18 18:15:22 crc kubenswrapper[4830]: I0318 18:15:22.980704 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.118696 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmq8\" (UniqueName: \"kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8\") pod \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.119946 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities\") pod \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.120037 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content\") pod \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\" (UID: \"9445ef0c-ef2a-4b12-8505-6864f39e0f59\") " Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.120730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities" (OuterVolumeSpecName: "utilities") pod "9445ef0c-ef2a-4b12-8505-6864f39e0f59" (UID: "9445ef0c-ef2a-4b12-8505-6864f39e0f59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.133051 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8" (OuterVolumeSpecName: "kube-api-access-mgmq8") pod "9445ef0c-ef2a-4b12-8505-6864f39e0f59" (UID: "9445ef0c-ef2a-4b12-8505-6864f39e0f59"). InnerVolumeSpecName "kube-api-access-mgmq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.230448 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmq8\" (UniqueName: \"kubernetes.io/projected/9445ef0c-ef2a-4b12-8505-6864f39e0f59-kube-api-access-mgmq8\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.230496 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.277158 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9445ef0c-ef2a-4b12-8505-6864f39e0f59" (UID: "9445ef0c-ef2a-4b12-8505-6864f39e0f59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.331492 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9445ef0c-ef2a-4b12-8505-6864f39e0f59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.677170 4830 generic.go:334] "Generic (PLEG): container finished" podID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerID="a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473" exitCode=0 Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.677249 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rbh58" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.677311 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerDied","Data":"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473"} Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.677350 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbh58" event={"ID":"9445ef0c-ef2a-4b12-8505-6864f39e0f59","Type":"ContainerDied","Data":"78c03fc65e45342605e97d4988ae70a53023dfea72dda838940ee6eead4c7bdd"} Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.677374 4830 scope.go:117] "RemoveContainer" containerID="a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473" Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.711533 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:23 crc kubenswrapper[4830]: I0318 18:15:23.722430 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rbh58"] Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.240840 
4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" path="/var/lib/kubelet/pods/9445ef0c-ef2a-4b12-8505-6864f39e0f59/volumes" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.825354 4830 scope.go:117] "RemoveContainer" containerID="f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.873269 4830 scope.go:117] "RemoveContainer" containerID="6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.896607 4830 scope.go:117] "RemoveContainer" containerID="a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473" Mar 18 18:15:24 crc kubenswrapper[4830]: E0318 18:15:24.897114 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473\": container with ID starting with a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473 not found: ID does not exist" containerID="a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.897166 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473"} err="failed to get container status \"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473\": rpc error: code = NotFound desc = could not find container \"a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473\": container with ID starting with a60e139c459ad24dc3e74e191e2360247344c065d05599e12094baa1562c6473 not found: ID does not exist" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.897201 4830 scope.go:117] "RemoveContainer" containerID="f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b" Mar 18 18:15:24 crc 
kubenswrapper[4830]: E0318 18:15:24.898379 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b\": container with ID starting with f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b not found: ID does not exist" containerID="f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.898429 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b"} err="failed to get container status \"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b\": rpc error: code = NotFound desc = could not find container \"f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b\": container with ID starting with f29e99cff57897b1d74720bda3153c91977b23d21f762c3cde36b8656e1cd65b not found: ID does not exist" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.898460 4830 scope.go:117] "RemoveContainer" containerID="6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e" Mar 18 18:15:24 crc kubenswrapper[4830]: E0318 18:15:24.898745 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e\": container with ID starting with 6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e not found: ID does not exist" containerID="6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e" Mar 18 18:15:24 crc kubenswrapper[4830]: I0318 18:15:24.898796 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e"} err="failed to get container status 
\"6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e\": rpc error: code = NotFound desc = could not find container \"6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e\": container with ID starting with 6dda02652eb9fc1c81cb1853f7bc8fafd3ce695d1587d32cbcfcc5071858808e not found: ID does not exist" Mar 18 18:15:25 crc kubenswrapper[4830]: I0318 18:15:25.688942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" event={"ID":"1ca766c2-0f41-45d7-b219-d5293e66ca65","Type":"ContainerStarted","Data":"19292b5e3d91ac4558dd41ad9096a9d8ed4db38e1d5a4fabb5ede31e1d3a62c4"} Mar 18 18:15:25 crc kubenswrapper[4830]: I0318 18:15:25.690554 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:25 crc kubenswrapper[4830]: I0318 18:15:25.723546 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" podStartSLOduration=2.652169938 podStartE2EDuration="9.723524836s" podCreationTimestamp="2026-03-18 18:15:16 +0000 UTC" firstStartedPulling="2026-03-18 18:15:17.826548269 +0000 UTC m=+752.394178631" lastFinishedPulling="2026-03-18 18:15:24.897903197 +0000 UTC m=+759.465533529" observedRunningTime="2026-03-18 18:15:25.71858056 +0000 UTC m=+760.286210892" watchObservedRunningTime="2026-03-18 18:15:25.723524836 +0000 UTC m=+760.291155168" Mar 18 18:15:37 crc kubenswrapper[4830]: I0318 18:15:37.314666 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-85f6f6858-qtgzb" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.065415 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86dfc68bcc-zxmhg" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.930614 4830 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jhmfc"] Mar 18 18:15:57 crc kubenswrapper[4830]: E0318 18:15:57.930887 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="extract-utilities" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.930907 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="extract-utilities" Mar 18 18:15:57 crc kubenswrapper[4830]: E0318 18:15:57.930937 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="extract-content" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.930946 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="extract-content" Mar 18 18:15:57 crc kubenswrapper[4830]: E0318 18:15:57.930960 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="registry-server" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.930971 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="registry-server" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.931109 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9445ef0c-ef2a-4b12-8505-6864f39e0f59" containerName="registry-server" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.933147 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.938325 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bmn4r" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.938564 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.938612 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.948699 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh"] Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.949489 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.952898 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 18:15:57 crc kubenswrapper[4830]: I0318 18:15:57.968646 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh"] Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.034920 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7csnn"] Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.035852 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.040143 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.040176 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.040242 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-svm96" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.040189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.045765 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-xcvlh"] Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.046843 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.048061 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.058418 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xcvlh"] Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.082531 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac9878d6-cca1-49b1-bca8-3ad035256043-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.082596 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-sockets\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.082621 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.082662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slk7q\" (UniqueName: \"kubernetes.io/projected/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-kube-api-access-slk7q\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc 
kubenswrapper[4830]: I0318 18:15:58.082723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-reloader\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.082815 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-conf\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.083065 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics-certs\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.083147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-startup\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.083214 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhth\" (UniqueName: \"kubernetes.io/projected/ac9878d6-cca1-49b1-bca8-3ad035256043-kube-api-access-2dhth\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.184909 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac9878d6-cca1-49b1-bca8-3ad035256043-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.184991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-sockets\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185058 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185127 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slk7q\" (UniqueName: \"kubernetes.io/projected/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-kube-api-access-slk7q\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlb6\" (UniqueName: 
\"kubernetes.io/projected/73874e54-172a-4960-8e37-e495e16e4ff7-kube-api-access-sdlb6\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-reloader\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185251 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-metrics-certs\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185269 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dkg\" (UniqueName: \"kubernetes.io/projected/c470043c-dedf-46ee-a690-ccc828a69f63-kube-api-access-99dkg\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-conf\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-cert\") pod 
\"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics-certs\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c470043c-dedf-46ee-a690-ccc828a69f63-metallb-excludel2\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185476 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-startup\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185502 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhth\" (UniqueName: \"kubernetes.io/projected/ac9878d6-cca1-49b1-bca8-3ad035256043-kube-api-access-2dhth\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.185521 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-metrics-certs\") pod \"speaker-7csnn\" (UID: 
\"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.186223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-conf\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.186341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.186631 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-reloader\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.186680 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-sockets\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.187169 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-frr-startup\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.192234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ac9878d6-cca1-49b1-bca8-3ad035256043-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.195312 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-metrics-certs\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.201389 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slk7q\" (UniqueName: \"kubernetes.io/projected/0f86ec1c-6e52-4bd0-af13-dbb311f12c6b-kube-api-access-slk7q\") pod \"frr-k8s-jhmfc\" (UID: \"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b\") " pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.204354 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhth\" (UniqueName: \"kubernetes.io/projected/ac9878d6-cca1-49b1-bca8-3ad035256043-kube-api-access-2dhth\") pod \"frr-k8s-webhook-server-bcc4b6f68-kjllh\" (UID: \"ac9878d6-cca1-49b1-bca8-3ad035256043\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.255040 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.276204 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.286773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-metrics-certs\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.286840 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99dkg\" (UniqueName: \"kubernetes.io/projected/c470043c-dedf-46ee-a690-ccc828a69f63-kube-api-access-99dkg\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.286886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-cert\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.286912 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c470043c-dedf-46ee-a690-ccc828a69f63-metallb-excludel2\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.286947 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-metrics-certs\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 
18:15:58.286973 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.287015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlb6\" (UniqueName: \"kubernetes.io/projected/73874e54-172a-4960-8e37-e495e16e4ff7-kube-api-access-sdlb6\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: E0318 18:15:58.287526 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 18:15:58 crc kubenswrapper[4830]: E0318 18:15:58.287612 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist podName:c470043c-dedf-46ee-a690-ccc828a69f63 nodeName:}" failed. No retries permitted until 2026-03-18 18:15:58.787588103 +0000 UTC m=+793.355218435 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist") pod "speaker-7csnn" (UID: "c470043c-dedf-46ee-a690-ccc828a69f63") : secret "metallb-memberlist" not found Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.288157 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c470043c-dedf-46ee-a690-ccc828a69f63-metallb-excludel2\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.291053 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-metrics-certs\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.292107 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-metrics-certs\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.303231 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.306871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73874e54-172a-4960-8e37-e495e16e4ff7-cert\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.309828 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-99dkg\" (UniqueName: \"kubernetes.io/projected/c470043c-dedf-46ee-a690-ccc828a69f63-kube-api-access-99dkg\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.317719 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlb6\" (UniqueName: \"kubernetes.io/projected/73874e54-172a-4960-8e37-e495e16e4ff7-kube-api-access-sdlb6\") pod \"controller-7bb4cc7c98-xcvlh\" (UID: \"73874e54-172a-4960-8e37-e495e16e4ff7\") " pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.368266 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.512414 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh"] Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.604093 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xcvlh"] Mar 18 18:15:58 crc kubenswrapper[4830]: W0318 18:15:58.607803 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73874e54_172a_4960_8e37_e495e16e4ff7.slice/crio-fbf09dd55906df21144db1d766bf9fcb06de875f41f1cf4abef16fd61ed6b3b7 WatchSource:0}: Error finding container fbf09dd55906df21144db1d766bf9fcb06de875f41f1cf4abef16fd61ed6b3b7: Status 404 returned error can't find the container with id fbf09dd55906df21144db1d766bf9fcb06de875f41f1cf4abef16fd61ed6b3b7 Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.793129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist\") pod \"speaker-7csnn\" (UID: 
\"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:58 crc kubenswrapper[4830]: E0318 18:15:58.793357 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 18:15:58 crc kubenswrapper[4830]: E0318 18:15:58.793417 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist podName:c470043c-dedf-46ee-a690-ccc828a69f63 nodeName:}" failed. No retries permitted until 2026-03-18 18:15:59.793397213 +0000 UTC m=+794.361027555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist") pod "speaker-7csnn" (UID: "c470043c-dedf-46ee-a690-ccc828a69f63") : secret "metallb-memberlist" not found Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.931705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" event={"ID":"ac9878d6-cca1-49b1-bca8-3ad035256043","Type":"ContainerStarted","Data":"b93b423d78ebab8a8fd8e0351bb054883b3c2d21e28c473fca357001976144be"} Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.933039 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"6db00de407a66a117919f44314dda781dafb78feeca8ca88688de6753c56fc27"} Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.935549 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xcvlh" event={"ID":"73874e54-172a-4960-8e37-e495e16e4ff7","Type":"ContainerStarted","Data":"a59d3aff5b874cc4a49c81964b7b95b6cd45a38ddee88ac63c61a01d861982a2"} Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.935577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xcvlh" 
event={"ID":"73874e54-172a-4960-8e37-e495e16e4ff7","Type":"ContainerStarted","Data":"08b5c6a76a957b453b27649724cf6c543b2e294887d7cf7a4e0453cceae7cb74"} Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.935587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xcvlh" event={"ID":"73874e54-172a-4960-8e37-e495e16e4ff7","Type":"ContainerStarted","Data":"fbf09dd55906df21144db1d766bf9fcb06de875f41f1cf4abef16fd61ed6b3b7"} Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.935811 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:15:58 crc kubenswrapper[4830]: I0318 18:15:58.956032 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-xcvlh" podStartSLOduration=0.956004882 podStartE2EDuration="956.004882ms" podCreationTimestamp="2026-03-18 18:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:15:58.954777598 +0000 UTC m=+793.522407940" watchObservedRunningTime="2026-03-18 18:15:58.956004882 +0000 UTC m=+793.523635214" Mar 18 18:15:59 crc kubenswrapper[4830]: I0318 18:15:59.509933 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:15:59 crc kubenswrapper[4830]: I0318 18:15:59.510379 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:59 crc 
kubenswrapper[4830]: I0318 18:15:59.809434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:59 crc kubenswrapper[4830]: I0318 18:15:59.817746 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c470043c-dedf-46ee-a690-ccc828a69f63-memberlist\") pod \"speaker-7csnn\" (UID: \"c470043c-dedf-46ee-a690-ccc828a69f63\") " pod="metallb-system/speaker-7csnn" Mar 18 18:15:59 crc kubenswrapper[4830]: I0318 18:15:59.857069 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7csnn" Mar 18 18:15:59 crc kubenswrapper[4830]: W0318 18:15:59.891318 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc470043c_dedf_46ee_a690_ccc828a69f63.slice/crio-7b7d6b8c3e54a914a6022f88bda67b0ad6fa5e467b86a27450b175197a50bf34 WatchSource:0}: Error finding container 7b7d6b8c3e54a914a6022f88bda67b0ad6fa5e467b86a27450b175197a50bf34: Status 404 returned error can't find the container with id 7b7d6b8c3e54a914a6022f88bda67b0ad6fa5e467b86a27450b175197a50bf34 Mar 18 18:15:59 crc kubenswrapper[4830]: I0318 18:15:59.975022 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7csnn" event={"ID":"c470043c-dedf-46ee-a690-ccc828a69f63","Type":"ContainerStarted","Data":"7b7d6b8c3e54a914a6022f88bda67b0ad6fa5e467b86a27450b175197a50bf34"} Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.125486 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564296-x6d6s"] Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.126424 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.131820 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.132160 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.132277 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.155654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-x6d6s"] Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.215594 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9tj7\" (UniqueName: \"kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7\") pod \"auto-csr-approver-29564296-x6d6s\" (UID: \"e4491dda-341a-42c5-b842-c42e7ac2f5bd\") " pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.317444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9tj7\" (UniqueName: \"kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7\") pod \"auto-csr-approver-29564296-x6d6s\" (UID: \"e4491dda-341a-42c5-b842-c42e7ac2f5bd\") " pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.793556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9tj7\" (UniqueName: \"kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7\") pod \"auto-csr-approver-29564296-x6d6s\" (UID: \"e4491dda-341a-42c5-b842-c42e7ac2f5bd\") " 
pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.998955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7csnn" event={"ID":"c470043c-dedf-46ee-a690-ccc828a69f63","Type":"ContainerStarted","Data":"ae0d33a993fb4916e48f2c333919b0eb0abf34f20b85f62db54d381560c0fb05"} Mar 18 18:16:00 crc kubenswrapper[4830]: I0318 18:16:00.999013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7csnn" event={"ID":"c470043c-dedf-46ee-a690-ccc828a69f63","Type":"ContainerStarted","Data":"31455185d5103d1e6e915072d684c9cbcf7ffbe7b2a44fd853380ec28396ccca"} Mar 18 18:16:01 crc kubenswrapper[4830]: I0318 18:16:01.000241 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7csnn" Mar 18 18:16:01 crc kubenswrapper[4830]: I0318 18:16:01.016675 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7csnn" podStartSLOduration=3.016657751 podStartE2EDuration="3.016657751s" podCreationTimestamp="2026-03-18 18:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:16:01.015111188 +0000 UTC m=+795.582741520" watchObservedRunningTime="2026-03-18 18:16:01.016657751 +0000 UTC m=+795.584288093" Mar 18 18:16:01 crc kubenswrapper[4830]: I0318 18:16:01.065219 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:01 crc kubenswrapper[4830]: I0318 18:16:01.583501 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-x6d6s"] Mar 18 18:16:02 crc kubenswrapper[4830]: I0318 18:16:02.025027 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" event={"ID":"e4491dda-341a-42c5-b842-c42e7ac2f5bd","Type":"ContainerStarted","Data":"a245ec951b73124a4ab09ca940fa0dd1d240879ba40123acbecf946e84ab5fd8"} Mar 18 18:16:06 crc kubenswrapper[4830]: I0318 18:16:06.060050 4830 generic.go:334] "Generic (PLEG): container finished" podID="0f86ec1c-6e52-4bd0-af13-dbb311f12c6b" containerID="0c1516d2b668e5c0db1b6c271cb1f34f272752ea6cc5c8d29c9633a78a25601a" exitCode=0 Mar 18 18:16:06 crc kubenswrapper[4830]: I0318 18:16:06.060156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerDied","Data":"0c1516d2b668e5c0db1b6c271cb1f34f272752ea6cc5c8d29c9633a78a25601a"} Mar 18 18:16:06 crc kubenswrapper[4830]: I0318 18:16:06.066061 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" event={"ID":"ac9878d6-cca1-49b1-bca8-3ad035256043","Type":"ContainerStarted","Data":"8854188eba820fd44ee9fac2c41172ffe66599fc2dc43416aa195784f06d3f25"} Mar 18 18:16:06 crc kubenswrapper[4830]: I0318 18:16:06.066275 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:16:06 crc kubenswrapper[4830]: I0318 18:16:06.116487 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" podStartSLOduration=1.938189252 podStartE2EDuration="9.116464449s" podCreationTimestamp="2026-03-18 18:15:57 +0000 UTC" 
firstStartedPulling="2026-03-18 18:15:58.527336741 +0000 UTC m=+793.094967073" lastFinishedPulling="2026-03-18 18:16:05.705611898 +0000 UTC m=+800.273242270" observedRunningTime="2026-03-18 18:16:06.109842187 +0000 UTC m=+800.677472519" watchObservedRunningTime="2026-03-18 18:16:06.116464449 +0000 UTC m=+800.684094781" Mar 18 18:16:07 crc kubenswrapper[4830]: I0318 18:16:07.078763 4830 generic.go:334] "Generic (PLEG): container finished" podID="0f86ec1c-6e52-4bd0-af13-dbb311f12c6b" containerID="ea4690e0e46d7d909f1eb4e91b66e541615c51e69fe4f118f0f26b01d2cfc246" exitCode=0 Mar 18 18:16:07 crc kubenswrapper[4830]: I0318 18:16:07.078830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerDied","Data":"ea4690e0e46d7d909f1eb4e91b66e541615c51e69fe4f118f0f26b01d2cfc246"} Mar 18 18:16:07 crc kubenswrapper[4830]: I0318 18:16:07.082614 4830 generic.go:334] "Generic (PLEG): container finished" podID="e4491dda-341a-42c5-b842-c42e7ac2f5bd" containerID="bbd47d7797a6e3b1e959aae752fd9cafe12dda4054864a8b3ab7c23584ffcf72" exitCode=0 Mar 18 18:16:07 crc kubenswrapper[4830]: I0318 18:16:07.082813 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" event={"ID":"e4491dda-341a-42c5-b842-c42e7ac2f5bd","Type":"ContainerDied","Data":"bbd47d7797a6e3b1e959aae752fd9cafe12dda4054864a8b3ab7c23584ffcf72"} Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.097754 4830 generic.go:334] "Generic (PLEG): container finished" podID="0f86ec1c-6e52-4bd0-af13-dbb311f12c6b" containerID="4bf86aede4dd97ba567fc6e071c1606d0acde30ddcd2e16ab32196ce1a8c7545" exitCode=0 Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.098043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" 
event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerDied","Data":"4bf86aede4dd97ba567fc6e071c1606d0acde30ddcd2e16ab32196ce1a8c7545"} Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.375944 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-xcvlh" Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.497560 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.594139 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9tj7\" (UniqueName: \"kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7\") pod \"e4491dda-341a-42c5-b842-c42e7ac2f5bd\" (UID: \"e4491dda-341a-42c5-b842-c42e7ac2f5bd\") " Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.603201 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7" (OuterVolumeSpecName: "kube-api-access-d9tj7") pod "e4491dda-341a-42c5-b842-c42e7ac2f5bd" (UID: "e4491dda-341a-42c5-b842-c42e7ac2f5bd"). InnerVolumeSpecName "kube-api-access-d9tj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:08 crc kubenswrapper[4830]: I0318 18:16:08.695642 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9tj7\" (UniqueName: \"kubernetes.io/projected/e4491dda-341a-42c5-b842-c42e7ac2f5bd-kube-api-access-d9tj7\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.118215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"5d83d588ae0870c5b85bdaddba43a4f41a904d902bc2d821225028ef351c89b6"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.118265 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"c539c2c3feb8e14d079bb7c9d329a47f2a46f9bdfcc4538ef5a737cd1f799537"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.118281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"ebcd681a335b289a9c9328feebea4e1753e49af64fd5f9340d06c16316ee011d"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.118294 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"9d61b9adbf5f3515b819a5ba4d85a9b31f235fca9d322902d07efe07b3bda51e"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.118306 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"f7613fa56a63f405b558251f778468292be3ce656bf4c8393a4630b473237aae"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.120164 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564296-x6d6s" event={"ID":"e4491dda-341a-42c5-b842-c42e7ac2f5bd","Type":"ContainerDied","Data":"a245ec951b73124a4ab09ca940fa0dd1d240879ba40123acbecf946e84ab5fd8"} Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.120199 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a245ec951b73124a4ab09ca940fa0dd1d240879ba40123acbecf946e84ab5fd8" Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.120235 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-x6d6s" Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.575426 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-msqq7"] Mar 18 18:16:09 crc kubenswrapper[4830]: I0318 18:16:09.579701 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-msqq7"] Mar 18 18:16:10 crc kubenswrapper[4830]: I0318 18:16:10.139727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhmfc" event={"ID":"0f86ec1c-6e52-4bd0-af13-dbb311f12c6b","Type":"ContainerStarted","Data":"f3b3826800849ad4206ff4a8cff203cb55fc1622fc87c9035fb99056aabd7cae"} Mar 18 18:16:10 crc kubenswrapper[4830]: I0318 18:16:10.140936 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:16:10 crc kubenswrapper[4830]: I0318 18:16:10.182543 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jhmfc" podStartSLOduration=5.944236913 podStartE2EDuration="13.182496298s" podCreationTimestamp="2026-03-18 18:15:57 +0000 UTC" firstStartedPulling="2026-03-18 18:15:58.460238967 +0000 UTC m=+793.027869309" lastFinishedPulling="2026-03-18 18:16:05.698498322 +0000 UTC m=+800.266128694" observedRunningTime="2026-03-18 18:16:10.176409331 +0000 UTC m=+804.744039703" 
watchObservedRunningTime="2026-03-18 18:16:10.182496298 +0000 UTC m=+804.750126680" Mar 18 18:16:10 crc kubenswrapper[4830]: I0318 18:16:10.254846 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319b53d8-aad8-414f-b7d0-204265ab9921" path="/var/lib/kubelet/pods/319b53d8-aad8-414f-b7d0-204265ab9921/volumes" Mar 18 18:16:10 crc kubenswrapper[4830]: I0318 18:16:10.509576 4830 scope.go:117] "RemoveContainer" containerID="de3bb072dd44b76e7b59f4f0e7c9702ffe3cb0ff2228c0454fcbc2eabf8b651d" Mar 18 18:16:13 crc kubenswrapper[4830]: I0318 18:16:13.256670 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:16:13 crc kubenswrapper[4830]: I0318 18:16:13.327214 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:16:18 crc kubenswrapper[4830]: I0318 18:16:18.260051 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jhmfc" Mar 18 18:16:18 crc kubenswrapper[4830]: I0318 18:16:18.284849 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kjllh" Mar 18 18:16:19 crc kubenswrapper[4830]: I0318 18:16:19.862654 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7csnn" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.559353 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq"] Mar 18 18:16:21 crc kubenswrapper[4830]: E0318 18:16:21.561200 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4491dda-341a-42c5-b842-c42e7ac2f5bd" containerName="oc" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.561335 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4491dda-341a-42c5-b842-c42e7ac2f5bd" containerName="oc" Mar 18 18:16:21 crc 
kubenswrapper[4830]: I0318 18:16:21.561551 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4491dda-341a-42c5-b842-c42e7ac2f5bd" containerName="oc" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.562645 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.567748 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.586160 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq"] Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.592119 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnpzn\" (UniqueName: \"kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.592299 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.592360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.693612 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.693674 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.693763 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnpzn\" (UniqueName: \"kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.694563 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: 
\"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.694570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.712781 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnpzn\" (UniqueName: \"kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:21 crc kubenswrapper[4830]: I0318 18:16:21.885480 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:22 crc kubenswrapper[4830]: W0318 18:16:22.429979 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b224be1_4685_41ee_b31e_8dfbcb80968d.slice/crio-a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df WatchSource:0}: Error finding container a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df: Status 404 returned error can't find the container with id a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df Mar 18 18:16:22 crc kubenswrapper[4830]: I0318 18:16:22.445227 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq"] Mar 18 18:16:23 crc kubenswrapper[4830]: I0318 18:16:23.245536 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerID="2804188d362fc007549dc72516b89272d9a4bd133eae9e9cb2a46b4fedadd193" exitCode=0 Mar 18 18:16:23 crc kubenswrapper[4830]: I0318 18:16:23.245705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" event={"ID":"0b224be1-4685-41ee-b31e-8dfbcb80968d","Type":"ContainerDied","Data":"2804188d362fc007549dc72516b89272d9a4bd133eae9e9cb2a46b4fedadd193"} Mar 18 18:16:23 crc kubenswrapper[4830]: I0318 18:16:23.246003 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" event={"ID":"0b224be1-4685-41ee-b31e-8dfbcb80968d","Type":"ContainerStarted","Data":"a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df"} Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.271064 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-hgwk5"] Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.280094 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.297951 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgwk5"] Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.467373 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jczh\" (UniqueName: \"kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.467431 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.467456 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.568562 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jczh\" (UniqueName: \"kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh\") pod \"community-operators-hgwk5\" (UID: 
\"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.568621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.568656 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.569226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.569333 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.593131 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jczh\" (UniqueName: \"kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh\") pod \"community-operators-hgwk5\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") " 
pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:25 crc kubenswrapper[4830]: I0318 18:16:25.609026 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:26 crc kubenswrapper[4830]: I0318 18:16:26.962250 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgwk5"] Mar 18 18:16:26 crc kubenswrapper[4830]: W0318 18:16:26.988134 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode913277a_53a4_47ec_b203_2db929dc1034.slice/crio-48272e5c3452231cab4de74223cfc30c698844f04676c0acab0063f2712bf94b WatchSource:0}: Error finding container 48272e5c3452231cab4de74223cfc30c698844f04676c0acab0063f2712bf94b: Status 404 returned error can't find the container with id 48272e5c3452231cab4de74223cfc30c698844f04676c0acab0063f2712bf94b Mar 18 18:16:27 crc kubenswrapper[4830]: I0318 18:16:27.284527 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerID="841801ad18823c6e0acb51d30a179c69d03e3292be72e9b36c3b855dbd69c58e" exitCode=0 Mar 18 18:16:27 crc kubenswrapper[4830]: I0318 18:16:27.284637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" event={"ID":"0b224be1-4685-41ee-b31e-8dfbcb80968d","Type":"ContainerDied","Data":"841801ad18823c6e0acb51d30a179c69d03e3292be72e9b36c3b855dbd69c58e"} Mar 18 18:16:27 crc kubenswrapper[4830]: I0318 18:16:27.289764 4830 generic.go:334] "Generic (PLEG): container finished" podID="e913277a-53a4-47ec-b203-2db929dc1034" containerID="2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1" exitCode=0 Mar 18 18:16:27 crc kubenswrapper[4830]: I0318 18:16:27.289861 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerDied","Data":"2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1"} Mar 18 18:16:27 crc kubenswrapper[4830]: I0318 18:16:27.289945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerStarted","Data":"48272e5c3452231cab4de74223cfc30c698844f04676c0acab0063f2712bf94b"} Mar 18 18:16:28 crc kubenswrapper[4830]: I0318 18:16:28.302932 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerID="5a43c573865c629a04fcaf12f60b868bc21362586238bc95f9b93ec09920c374" exitCode=0 Mar 18 18:16:28 crc kubenswrapper[4830]: I0318 18:16:28.303072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" event={"ID":"0b224be1-4685-41ee-b31e-8dfbcb80968d","Type":"ContainerDied","Data":"5a43c573865c629a04fcaf12f60b868bc21362586238bc95f9b93ec09920c374"} Mar 18 18:16:28 crc kubenswrapper[4830]: I0318 18:16:28.308840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerStarted","Data":"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"} Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.320339 4830 generic.go:334] "Generic (PLEG): container finished" podID="e913277a-53a4-47ec-b203-2db929dc1034" containerID="c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e" exitCode=0 Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.321072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" 
event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerDied","Data":"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"} Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.509729 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.509841 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.678229 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.829988 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle\") pod \"0b224be1-4685-41ee-b31e-8dfbcb80968d\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.830125 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnpzn\" (UniqueName: \"kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn\") pod \"0b224be1-4685-41ee-b31e-8dfbcb80968d\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.830179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util\") pod \"0b224be1-4685-41ee-b31e-8dfbcb80968d\" (UID: \"0b224be1-4685-41ee-b31e-8dfbcb80968d\") " Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.831969 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle" (OuterVolumeSpecName: "bundle") pod "0b224be1-4685-41ee-b31e-8dfbcb80968d" (UID: "0b224be1-4685-41ee-b31e-8dfbcb80968d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.839546 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn" (OuterVolumeSpecName: "kube-api-access-hnpzn") pod "0b224be1-4685-41ee-b31e-8dfbcb80968d" (UID: "0b224be1-4685-41ee-b31e-8dfbcb80968d"). InnerVolumeSpecName "kube-api-access-hnpzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.852591 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util" (OuterVolumeSpecName: "util") pod "0b224be1-4685-41ee-b31e-8dfbcb80968d" (UID: "0b224be1-4685-41ee-b31e-8dfbcb80968d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.932071 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.932409 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnpzn\" (UniqueName: \"kubernetes.io/projected/0b224be1-4685-41ee-b31e-8dfbcb80968d-kube-api-access-hnpzn\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:29 crc kubenswrapper[4830]: I0318 18:16:29.932589 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b224be1-4685-41ee-b31e-8dfbcb80968d-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:30 crc kubenswrapper[4830]: I0318 18:16:30.331618 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" event={"ID":"0b224be1-4685-41ee-b31e-8dfbcb80968d","Type":"ContainerDied","Data":"a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df"} Mar 18 18:16:30 crc kubenswrapper[4830]: I0318 18:16:30.332403 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0f8745eeee00c44344d571def5a4f317b07a5c19dcfbf7ea79c5103d28e75df" Mar 18 18:16:30 crc kubenswrapper[4830]: I0318 18:16:30.331716 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq" Mar 18 18:16:31 crc kubenswrapper[4830]: I0318 18:16:31.341430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerStarted","Data":"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"} Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.454407 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgwk5" podStartSLOduration=4.639761439 podStartE2EDuration="8.454385519s" podCreationTimestamp="2026-03-18 18:16:25 +0000 UTC" firstStartedPulling="2026-03-18 18:16:27.291277445 +0000 UTC m=+821.858907787" lastFinishedPulling="2026-03-18 18:16:31.105901525 +0000 UTC m=+825.673531867" observedRunningTime="2026-03-18 18:16:31.368655426 +0000 UTC m=+825.936285768" watchObservedRunningTime="2026-03-18 18:16:33.454385519 +0000 UTC m=+828.022015851" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.459050 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx"] Mar 18 18:16:33 crc kubenswrapper[4830]: E0318 18:16:33.459649 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="pull" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.459672 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="pull" Mar 18 18:16:33 crc kubenswrapper[4830]: E0318 18:16:33.459697 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="util" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.459706 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="util" Mar 18 18:16:33 crc kubenswrapper[4830]: E0318 18:16:33.459717 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="extract" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.459726 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="extract" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.459880 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b224be1-4685-41ee-b31e-8dfbcb80968d" containerName="extract" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.460384 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.462580 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.462789 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.462900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-ghwtq" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.476125 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx"] Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.586371 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqh2\" (UniqueName: \"kubernetes.io/projected/79a0edc8-46ee-4263-86a6-b7cd379c046f-kube-api-access-6nqh2\") pod 
\"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.586613 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79a0edc8-46ee-4263-86a6-b7cd379c046f-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.688489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqh2\" (UniqueName: \"kubernetes.io/projected/79a0edc8-46ee-4263-86a6-b7cd379c046f-kube-api-access-6nqh2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.688564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79a0edc8-46ee-4263-86a6-b7cd379c046f-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.689346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79a0edc8-46ee-4263-86a6-b7cd379c046f-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 
18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.709966 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqh2\" (UniqueName: \"kubernetes.io/projected/79a0edc8-46ee-4263-86a6-b7cd379c046f-kube-api-access-6nqh2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vlrgx\" (UID: \"79a0edc8-46ee-4263-86a6-b7cd379c046f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:33 crc kubenswrapper[4830]: I0318 18:16:33.777033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" Mar 18 18:16:34 crc kubenswrapper[4830]: I0318 18:16:34.251306 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx"] Mar 18 18:16:34 crc kubenswrapper[4830]: I0318 18:16:34.361797 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" event={"ID":"79a0edc8-46ee-4263-86a6-b7cd379c046f","Type":"ContainerStarted","Data":"a70e807faadb5f3366907c0b08706066dfa63c126ecd173204b869c16969566b"} Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.270399 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"] Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.272465 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.291403 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"] Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.416327 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.416401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.416437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8spq\" (UniqueName: \"kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.518059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.518520 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.518707 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8spq\" (UniqueName: \"kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.524124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.524163 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.547958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8spq\" (UniqueName: \"kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq\") pod \"redhat-marketplace-257kx\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") " pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.610076 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.610596 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.614757 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-257kx" Mar 18 18:16:35 crc kubenswrapper[4830]: I0318 18:16:35.667145 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:36 crc kubenswrapper[4830]: I0318 18:16:36.083177 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"] Mar 18 18:16:36 crc kubenswrapper[4830]: I0318 18:16:36.377546 4830 generic.go:334] "Generic (PLEG): container finished" podID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerID="4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d" exitCode=0 Mar 18 18:16:36 crc kubenswrapper[4830]: I0318 18:16:36.377596 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerDied","Data":"4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d"} Mar 18 18:16:36 crc kubenswrapper[4830]: I0318 18:16:36.377650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerStarted","Data":"f3f097c113f31e0eaf30f29a7cdd555e5c6159ace59e29659a41b14f50156c58"} Mar 18 18:16:36 crc kubenswrapper[4830]: I0318 18:16:36.427422 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgwk5" Mar 18 18:16:38 crc kubenswrapper[4830]: I0318 18:16:38.393746 4830 generic.go:334] "Generic (PLEG): 
container finished" podID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerID="ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee" exitCode=0
Mar 18 18:16:38 crc kubenswrapper[4830]: I0318 18:16:38.393850 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerDied","Data":"ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee"}
Mar 18 18:16:38 crc kubenswrapper[4830]: I0318 18:16:38.398014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" event={"ID":"79a0edc8-46ee-4263-86a6-b7cd379c046f","Type":"ContainerStarted","Data":"c5d65208f58bf881adbef2be1235a0401184374748b583d28a1ec39fbfaa99d4"}
Mar 18 18:16:38 crc kubenswrapper[4830]: I0318 18:16:38.451467 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vlrgx" podStartSLOduration=1.76631181 podStartE2EDuration="5.451451009s" podCreationTimestamp="2026-03-18 18:16:33 +0000 UTC" firstStartedPulling="2026-03-18 18:16:34.26162303 +0000 UTC m=+828.829253362" lastFinishedPulling="2026-03-18 18:16:37.946762209 +0000 UTC m=+832.514392561" observedRunningTime="2026-03-18 18:16:38.448610036 +0000 UTC m=+833.016240368" watchObservedRunningTime="2026-03-18 18:16:38.451451009 +0000 UTC m=+833.019081341"
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.266508 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgwk5"]
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.406612 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerStarted","Data":"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"}
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.406795 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgwk5" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="registry-server" containerID="cri-o://96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3" gracePeriod=2
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.439177 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-257kx" podStartSLOduration=1.9816489160000001 podStartE2EDuration="4.439158644s" podCreationTimestamp="2026-03-18 18:16:35 +0000 UTC" firstStartedPulling="2026-03-18 18:16:36.381374687 +0000 UTC m=+830.949005019" lastFinishedPulling="2026-03-18 18:16:38.838884375 +0000 UTC m=+833.406514747" observedRunningTime="2026-03-18 18:16:39.432927132 +0000 UTC m=+834.000557464" watchObservedRunningTime="2026-03-18 18:16:39.439158644 +0000 UTC m=+834.006788976"
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.814809 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgwk5"
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.889951 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities\") pod \"e913277a-53a4-47ec-b203-2db929dc1034\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") "
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.890397 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jczh\" (UniqueName: \"kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh\") pod \"e913277a-53a4-47ec-b203-2db929dc1034\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") "
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.890467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content\") pod \"e913277a-53a4-47ec-b203-2db929dc1034\" (UID: \"e913277a-53a4-47ec-b203-2db929dc1034\") "
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.890971 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities" (OuterVolumeSpecName: "utilities") pod "e913277a-53a4-47ec-b203-2db929dc1034" (UID: "e913277a-53a4-47ec-b203-2db929dc1034"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.899128 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh" (OuterVolumeSpecName: "kube-api-access-4jczh") pod "e913277a-53a4-47ec-b203-2db929dc1034" (UID: "e913277a-53a4-47ec-b203-2db929dc1034"). InnerVolumeSpecName "kube-api-access-4jczh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.946027 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e913277a-53a4-47ec-b203-2db929dc1034" (UID: "e913277a-53a4-47ec-b203-2db929dc1034"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.991944 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jczh\" (UniqueName: \"kubernetes.io/projected/e913277a-53a4-47ec-b203-2db929dc1034-kube-api-access-4jczh\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.991988 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:39 crc kubenswrapper[4830]: I0318 18:16:39.991998 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e913277a-53a4-47ec-b203-2db929dc1034-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.414924 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerDied","Data":"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"}
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.414957 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgwk5"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.414995 4830 scope.go:117] "RemoveContainer" containerID="96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.414863 4830 generic.go:334] "Generic (PLEG): container finished" podID="e913277a-53a4-47ec-b203-2db929dc1034" containerID="96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3" exitCode=0
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.415255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgwk5" event={"ID":"e913277a-53a4-47ec-b203-2db929dc1034","Type":"ContainerDied","Data":"48272e5c3452231cab4de74223cfc30c698844f04676c0acab0063f2712bf94b"}
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.432119 4830 scope.go:117] "RemoveContainer" containerID="c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.458903 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgwk5"]
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.459346 4830 scope.go:117] "RemoveContainer" containerID="2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.465297 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgwk5"]
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.484096 4830 scope.go:117] "RemoveContainer" containerID="96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.484719 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3\": container with ID starting with 96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3 not found: ID does not exist" containerID="96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.484755 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3"} err="failed to get container status \"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3\": rpc error: code = NotFound desc = could not find container \"96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3\": container with ID starting with 96d9e58b3f693aad5fcecdb1e7c4f61d719d57a867694d32dff1e4bd1de011d3 not found: ID does not exist"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.484791 4830 scope.go:117] "RemoveContainer" containerID="c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.485273 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e\": container with ID starting with c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e not found: ID does not exist" containerID="c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.485325 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e"} err="failed to get container status \"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e\": rpc error: code = NotFound desc = could not find container \"c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e\": container with ID starting with c6585a536d8190a7495d7a8d53e44676b7b8731a1cb8fc5486934f0f97b2499e not found: ID does not exist"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.485367 4830 scope.go:117] "RemoveContainer" containerID="2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1"
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.485890 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1\": container with ID starting with 2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1 not found: ID does not exist" containerID="2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.485921 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1"} err="failed to get container status \"2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1\": rpc error: code = NotFound desc = could not find container \"2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1\": container with ID starting with 2214f95becf4e4d24da36608b1224b5d43f329a7100f2789cb7550d0348901f1 not found: ID does not exist"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.991390 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-k5pz4"]
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.991655 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="registry-server"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.991670 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="registry-server"
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.991686 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="extract-content"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.991694 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="extract-content"
Mar 18 18:16:40 crc kubenswrapper[4830]: E0318 18:16:40.991712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="extract-utilities"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.991721 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="extract-utilities"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.991875 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e913277a-53a4-47ec-b203-2db929dc1034" containerName="registry-server"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.992347 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.997935 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 18 18:16:40 crc kubenswrapper[4830]: I0318 18:16:40.998224 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.002189 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-k5pz4"]
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.038076 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6b6mf"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.110958 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.111013 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv9wh\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-kube-api-access-sv9wh\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.212514 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.212573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv9wh\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-kube-api-access-sv9wh\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.231573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.232950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv9wh\" (UniqueName: \"kubernetes.io/projected/4668ca7c-f03c-4236-83f8-17c1c156c754-kube-api-access-sv9wh\") pod \"cert-manager-webhook-6888856db4-k5pz4\" (UID: \"4668ca7c-f03c-4236-83f8-17c1c156c754\") " pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.345959 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:41 crc kubenswrapper[4830]: I0318 18:16:41.623029 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-k5pz4"]
Mar 18 18:16:42 crc kubenswrapper[4830]: I0318 18:16:42.246890 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e913277a-53a4-47ec-b203-2db929dc1034" path="/var/lib/kubelet/pods/e913277a-53a4-47ec-b203-2db929dc1034/volumes"
Mar 18 18:16:42 crc kubenswrapper[4830]: I0318 18:16:42.438552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4" event={"ID":"4668ca7c-f03c-4236-83f8-17c1c156c754","Type":"ContainerStarted","Data":"66a3c1ec750147e7d26b102f773e366308b913984942d977b29338707b1e961d"}
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.760531 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l4ncg"]
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.761833 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.768254 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-gt966"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.798746 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l4ncg"]
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.865485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rln9b\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-kube-api-access-rln9b\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.865592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.967111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.967171 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rln9b\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-kube-api-access-rln9b\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.987329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:44 crc kubenswrapper[4830]: I0318 18:16:44.991811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rln9b\" (UniqueName: \"kubernetes.io/projected/286e3d3b-4f86-45b5-819c-1e6c107eb985-kube-api-access-rln9b\") pod \"cert-manager-cainjector-5545bd876-l4ncg\" (UID: \"286e3d3b-4f86-45b5-819c-1e6c107eb985\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.132670 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg"
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.347523 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l4ncg"]
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.459911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg" event={"ID":"286e3d3b-4f86-45b5-819c-1e6c107eb985","Type":"ContainerStarted","Data":"315e62cf83d7198b69056aef80cefd94de16c3eb4f1d92c859d69c0d5a2f5c11"}
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.615886 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.616171 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:45 crc kubenswrapper[4830]: I0318 18:16:45.672797 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:46 crc kubenswrapper[4830]: I0318 18:16:46.516208 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:47 crc kubenswrapper[4830]: I0318 18:16:47.260725 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"]
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.482011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg" event={"ID":"286e3d3b-4f86-45b5-819c-1e6c107eb985","Type":"ContainerStarted","Data":"60fcf6be477f8f3a68ed4c268a0314e9b08df2b02e917fcd63cbbbfd6f9a7e53"}
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.484441 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-257kx" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="registry-server" containerID="cri-o://2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30" gracePeriod=2
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.484994 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4" event={"ID":"4668ca7c-f03c-4236-83f8-17c1c156c754","Type":"ContainerStarted","Data":"7d5741d47f0dc378a99a22f9dc190fc330d6821f16eaa24cc33a8878e17baf7b"}
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.485025 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.527913 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-l4ncg" podStartSLOduration=2.481590496 podStartE2EDuration="4.527895223s" podCreationTimestamp="2026-03-18 18:16:44 +0000 UTC" firstStartedPulling="2026-03-18 18:16:45.358112161 +0000 UTC m=+839.925742493" lastFinishedPulling="2026-03-18 18:16:47.404416888 +0000 UTC m=+841.972047220" observedRunningTime="2026-03-18 18:16:48.502014029 +0000 UTC m=+843.069644361" watchObservedRunningTime="2026-03-18 18:16:48.527895223 +0000 UTC m=+843.095525555"
Mar 18 18:16:48 crc kubenswrapper[4830]: I0318 18:16:48.531731 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4" podStartSLOduration=2.756466071 podStartE2EDuration="8.53172128s" podCreationTimestamp="2026-03-18 18:16:40 +0000 UTC" firstStartedPulling="2026-03-18 18:16:41.646840266 +0000 UTC m=+836.214470598" lastFinishedPulling="2026-03-18 18:16:47.422095475 +0000 UTC m=+841.989725807" observedRunningTime="2026-03-18 18:16:48.526429212 +0000 UTC m=+843.094059544" watchObservedRunningTime="2026-03-18 18:16:48.53172128 +0000 UTC m=+843.099351612"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.424119 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.492124 4830 generic.go:334] "Generic (PLEG): container finished" podID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerID="2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30" exitCode=0
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.492185 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-257kx"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.492936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerDied","Data":"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"}
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.493032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-257kx" event={"ID":"6771e8c7-44a1-4009-b125-0dbd5194b97c","Type":"ContainerDied","Data":"f3f097c113f31e0eaf30f29a7cdd555e5c6159ace59e29659a41b14f50156c58"}
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.493099 4830 scope.go:117] "RemoveContainer" containerID="2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.510866 4830 scope.go:117] "RemoveContainer" containerID="ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.528566 4830 scope.go:117] "RemoveContainer" containerID="4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.532333 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8spq\" (UniqueName: \"kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq\") pod \"6771e8c7-44a1-4009-b125-0dbd5194b97c\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") "
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.532412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities\") pod \"6771e8c7-44a1-4009-b125-0dbd5194b97c\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") "
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.532457 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content\") pod \"6771e8c7-44a1-4009-b125-0dbd5194b97c\" (UID: \"6771e8c7-44a1-4009-b125-0dbd5194b97c\") "
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.534942 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities" (OuterVolumeSpecName: "utilities") pod "6771e8c7-44a1-4009-b125-0dbd5194b97c" (UID: "6771e8c7-44a1-4009-b125-0dbd5194b97c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.539762 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq" (OuterVolumeSpecName: "kube-api-access-p8spq") pod "6771e8c7-44a1-4009-b125-0dbd5194b97c" (UID: "6771e8c7-44a1-4009-b125-0dbd5194b97c"). InnerVolumeSpecName "kube-api-access-p8spq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.550138 4830 scope.go:117] "RemoveContainer" containerID="2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"
Mar 18 18:16:49 crc kubenswrapper[4830]: E0318 18:16:49.551449 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30\": container with ID starting with 2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30 not found: ID does not exist" containerID="2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.551481 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30"} err="failed to get container status \"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30\": rpc error: code = NotFound desc = could not find container \"2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30\": container with ID starting with 2922b21adac55e0d116c28fdff44e568eb97818de417abe3087fdb8c3142fd30 not found: ID does not exist"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.551510 4830 scope.go:117] "RemoveContainer" containerID="ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee"
Mar 18 18:16:49 crc kubenswrapper[4830]: E0318 18:16:49.551784 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee\": container with ID starting with ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee not found: ID does not exist" containerID="ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.551808 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee"} err="failed to get container status \"ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee\": rpc error: code = NotFound desc = could not find container \"ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee\": container with ID starting with ad3d2c7b0ef4a199ab8654ec5e419efcb84aa764be1e9adda39b17cf2acf24ee not found: ID does not exist"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.551822 4830 scope.go:117] "RemoveContainer" containerID="4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d"
Mar 18 18:16:49 crc kubenswrapper[4830]: E0318 18:16:49.552154 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d\": container with ID starting with 4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d not found: ID does not exist" containerID="4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.552228 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d"} err="failed to get container status \"4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d\": rpc error: code = NotFound desc = could not find container \"4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d\": container with ID starting with 4fe558d81a12fb05a1f49c008f1589f8c72a88d38d9873054f404b12f8f92a8d not found: ID does not exist"
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.567377 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6771e8c7-44a1-4009-b125-0dbd5194b97c" (UID: "6771e8c7-44a1-4009-b125-0dbd5194b97c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.634797 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.634855 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6771e8c7-44a1-4009-b125-0dbd5194b97c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.634877 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8spq\" (UniqueName: \"kubernetes.io/projected/6771e8c7-44a1-4009-b125-0dbd5194b97c-kube-api-access-p8spq\") on node \"crc\" DevicePath \"\""
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.837354 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"]
Mar 18 18:16:49 crc kubenswrapper[4830]: I0318 18:16:49.843208 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-257kx"]
Mar 18 18:16:50 crc kubenswrapper[4830]: I0318 18:16:50.249884 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" path="/var/lib/kubelet/pods/6771e8c7-44a1-4009-b125-0dbd5194b97c/volumes"
Mar 18 18:16:56 crc kubenswrapper[4830]: I0318 18:16:56.350229 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-k5pz4"
Mar 18 18:16:59 crc kubenswrapper[4830]: I0318 18:16:59.510028 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:16:59 crc kubenswrapper[4830]: I0318 18:16:59.510488 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:16:59 crc kubenswrapper[4830]: I0318 18:16:59.510583 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:16:59 crc kubenswrapper[4830]: I0318 18:16:59.511873 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:16:59 crc kubenswrapper[4830]: I0318 18:16:59.512034 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49" gracePeriod=600
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.575348 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49" exitCode=0
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.575436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49"}
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.576400 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359"}
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.576440 4830 scope.go:117] "RemoveContainer" containerID="00c3c6a3091f8f5d9397121aaf2ddaed1a26f2cb7f216702ce3187e6b6274afc"
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.899075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-6d85b"]
Mar 18 18:17:00 crc kubenswrapper[4830]: E0318 18:17:00.899418 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="extract-utilities"
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.899437 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="extract-utilities"
Mar 18 18:17:00 crc kubenswrapper[4830]: E0318 18:17:00.899452 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="extract-content"
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.899463 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="extract-content"
Mar 18 18:17:00 crc kubenswrapper[4830]: E0318 18:17:00.899493 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="registry-server"
Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.899503 4830 state_mem.go:107] "Deleted
CPUSet assignment" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="registry-server" Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.899673 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6771e8c7-44a1-4009-b125-0dbd5194b97c" containerName="registry-server" Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.900316 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.902824 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8t77c" Mar 18 18:17:00 crc kubenswrapper[4830]: I0318 18:17:00.905532 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6d85b"] Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.068633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7kw\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-kube-api-access-5m7kw\") pod \"cert-manager-545d4d4674-6d85b\" (UID: \"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.068861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-bound-sa-token\") pod \"cert-manager-545d4d4674-6d85b\" (UID: \"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.170924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-bound-sa-token\") pod \"cert-manager-545d4d4674-6d85b\" (UID: 
\"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.171089 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7kw\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-kube-api-access-5m7kw\") pod \"cert-manager-545d4d4674-6d85b\" (UID: \"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.190837 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-bound-sa-token\") pod \"cert-manager-545d4d4674-6d85b\" (UID: \"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.192123 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7kw\" (UniqueName: \"kubernetes.io/projected/c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e-kube-api-access-5m7kw\") pod \"cert-manager-545d4d4674-6d85b\" (UID: \"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e\") " pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.221438 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6d85b" Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.483361 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6d85b"] Mar 18 18:17:01 crc kubenswrapper[4830]: I0318 18:17:01.588788 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6d85b" event={"ID":"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e","Type":"ContainerStarted","Data":"48e7b652e6b008fb6dd13881c000f43675b0acb38e0a8f90a9146fed5254a275"} Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.599540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6d85b" event={"ID":"c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e","Type":"ContainerStarted","Data":"6a11fd8a64f1504a76a12c2b23f44355b037125d29bc4912a7da3f0b49239e78"} Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.618619 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-6d85b" podStartSLOduration=2.618594587 podStartE2EDuration="2.618594587s" podCreationTimestamp="2026-03-18 18:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:17:02.6126136 +0000 UTC m=+857.180243942" watchObservedRunningTime="2026-03-18 18:17:02.618594587 +0000 UTC m=+857.186224939" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.641453 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.642863 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.667722 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.695150 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.695222 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.695452 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsv7x\" (UniqueName: \"kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.796700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsv7x\" (UniqueName: \"kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.796832 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.796855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.797299 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.797641 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.831631 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsv7x\" (UniqueName: \"kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x\") pod \"certified-operators-zdmkm\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:02 crc kubenswrapper[4830]: I0318 18:17:02.962858 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:03 crc kubenswrapper[4830]: I0318 18:17:03.584061 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:03 crc kubenswrapper[4830]: I0318 18:17:03.605174 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerStarted","Data":"f9ea7b200fd21d2ad433a3f73ae488422a7eb280dd4ed7987ba256836d735566"} Mar 18 18:17:04 crc kubenswrapper[4830]: I0318 18:17:04.636350 4830 generic.go:334] "Generic (PLEG): container finished" podID="0368d018-687c-405a-bff7-2c333293f213" containerID="b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6" exitCode=0 Mar 18 18:17:04 crc kubenswrapper[4830]: I0318 18:17:04.636922 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerDied","Data":"b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6"} Mar 18 18:17:05 crc kubenswrapper[4830]: I0318 18:17:05.647103 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerStarted","Data":"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2"} Mar 18 18:17:06 crc kubenswrapper[4830]: I0318 18:17:06.657094 4830 generic.go:334] "Generic (PLEG): container finished" podID="0368d018-687c-405a-bff7-2c333293f213" containerID="32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2" exitCode=0 Mar 18 18:17:06 crc kubenswrapper[4830]: I0318 18:17:06.657162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" 
event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerDied","Data":"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2"} Mar 18 18:17:07 crc kubenswrapper[4830]: I0318 18:17:07.668669 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerStarted","Data":"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044"} Mar 18 18:17:07 crc kubenswrapper[4830]: I0318 18:17:07.695354 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zdmkm" podStartSLOduration=3.214580903 podStartE2EDuration="5.69533794s" podCreationTimestamp="2026-03-18 18:17:02 +0000 UTC" firstStartedPulling="2026-03-18 18:17:04.638722748 +0000 UTC m=+859.206353090" lastFinishedPulling="2026-03-18 18:17:07.119479785 +0000 UTC m=+861.687110127" observedRunningTime="2026-03-18 18:17:07.690874095 +0000 UTC m=+862.258504427" watchObservedRunningTime="2026-03-18 18:17:07.69533794 +0000 UTC m=+862.262968272" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.073585 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.075261 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.081560 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.084859 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kj577" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.085014 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.085143 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.111401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l645g\" (UniqueName: \"kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g\") pod \"openstack-operator-index-7bj7j\" (UID: \"889b3ae2-68c7-4466-b8df-c3c2b6628999\") " pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.213010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l645g\" (UniqueName: \"kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g\") pod \"openstack-operator-index-7bj7j\" (UID: \"889b3ae2-68c7-4466-b8df-c3c2b6628999\") " pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.233325 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l645g\" (UniqueName: \"kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g\") pod \"openstack-operator-index-7bj7j\" (UID: 
\"889b3ae2-68c7-4466-b8df-c3c2b6628999\") " pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.409814 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.678745 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:10 crc kubenswrapper[4830]: I0318 18:17:10.711137 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7bj7j" event={"ID":"889b3ae2-68c7-4466-b8df-c3c2b6628999","Type":"ContainerStarted","Data":"11bda6a866ebef53b640e61da811eadbd50ba698343a382fa6ffba0d225a4764"} Mar 18 18:17:12 crc kubenswrapper[4830]: I0318 18:17:12.732077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7bj7j" event={"ID":"889b3ae2-68c7-4466-b8df-c3c2b6628999","Type":"ContainerStarted","Data":"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98"} Mar 18 18:17:12 crc kubenswrapper[4830]: I0318 18:17:12.762276 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7bj7j" podStartSLOduration=1.852395876 podStartE2EDuration="2.762238375s" podCreationTimestamp="2026-03-18 18:17:10 +0000 UTC" firstStartedPulling="2026-03-18 18:17:10.698112565 +0000 UTC m=+865.265742897" lastFinishedPulling="2026-03-18 18:17:11.607955054 +0000 UTC m=+866.175585396" observedRunningTime="2026-03-18 18:17:12.753159201 +0000 UTC m=+867.320789543" watchObservedRunningTime="2026-03-18 18:17:12.762238375 +0000 UTC m=+867.329868747" Mar 18 18:17:12 crc kubenswrapper[4830]: I0318 18:17:12.963456 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:12 crc kubenswrapper[4830]: I0318 
18:17:12.963566 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:13 crc kubenswrapper[4830]: I0318 18:17:13.010006 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:14 crc kubenswrapper[4830]: I0318 18:17:14.012806 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:14 crc kubenswrapper[4830]: I0318 18:17:14.603477 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:14 crc kubenswrapper[4830]: I0318 18:17:14.745164 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7bj7j" podUID="889b3ae2-68c7-4466-b8df-c3c2b6628999" containerName="registry-server" containerID="cri-o://17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98" gracePeriod=2 Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.406846 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f9lwr"] Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.407681 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.436440 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9lwr"] Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.590018 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb27p\" (UniqueName: \"kubernetes.io/projected/2f7546fc-e4cb-438e-8091-74ed74bee260-kube-api-access-cb27p\") pod \"openstack-operator-index-f9lwr\" (UID: \"2f7546fc-e4cb-438e-8091-74ed74bee260\") " pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.687937 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.691424 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l645g\" (UniqueName: \"kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g\") pod \"889b3ae2-68c7-4466-b8df-c3c2b6628999\" (UID: \"889b3ae2-68c7-4466-b8df-c3c2b6628999\") " Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.691558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb27p\" (UniqueName: \"kubernetes.io/projected/2f7546fc-e4cb-438e-8091-74ed74bee260-kube-api-access-cb27p\") pod \"openstack-operator-index-f9lwr\" (UID: \"2f7546fc-e4cb-438e-8091-74ed74bee260\") " pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.705058 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g" (OuterVolumeSpecName: "kube-api-access-l645g") pod "889b3ae2-68c7-4466-b8df-c3c2b6628999" 
(UID: "889b3ae2-68c7-4466-b8df-c3c2b6628999"). InnerVolumeSpecName "kube-api-access-l645g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.712672 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb27p\" (UniqueName: \"kubernetes.io/projected/2f7546fc-e4cb-438e-8091-74ed74bee260-kube-api-access-cb27p\") pod \"openstack-operator-index-f9lwr\" (UID: \"2f7546fc-e4cb-438e-8091-74ed74bee260\") " pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.737144 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.754292 4830 generic.go:334] "Generic (PLEG): container finished" podID="889b3ae2-68c7-4466-b8df-c3c2b6628999" containerID="17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98" exitCode=0 Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.754344 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7bj7j" event={"ID":"889b3ae2-68c7-4466-b8df-c3c2b6628999","Type":"ContainerDied","Data":"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98"} Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.754356 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7bj7j" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.754376 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7bj7j" event={"ID":"889b3ae2-68c7-4466-b8df-c3c2b6628999","Type":"ContainerDied","Data":"11bda6a866ebef53b640e61da811eadbd50ba698343a382fa6ffba0d225a4764"} Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.754401 4830 scope.go:117] "RemoveContainer" containerID="17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.790273 4830 scope.go:117] "RemoveContainer" containerID="17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98" Mar 18 18:17:15 crc kubenswrapper[4830]: E0318 18:17:15.791437 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98\": container with ID starting with 17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98 not found: ID does not exist" containerID="17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.791477 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98"} err="failed to get container status \"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98\": rpc error: code = NotFound desc = could not find container \"17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98\": container with ID starting with 17cae5820332b19cd365c76134d453d4e36270bc8ae3ef813d00379e032dcc98 not found: ID does not exist" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.793057 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l645g\" (UniqueName: 
\"kubernetes.io/projected/889b3ae2-68c7-4466-b8df-c3c2b6628999-kube-api-access-l645g\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.797260 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:15 crc kubenswrapper[4830]: I0318 18:17:15.800614 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7bj7j"] Mar 18 18:17:16 crc kubenswrapper[4830]: I0318 18:17:16.166528 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9lwr"] Mar 18 18:17:16 crc kubenswrapper[4830]: W0318 18:17:16.179160 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7546fc_e4cb_438e_8091_74ed74bee260.slice/crio-6ae5d9a6fe0e950e58140eaf5e02666acfa1e3fc2bdb321f5942003f8fb5a91e WatchSource:0}: Error finding container 6ae5d9a6fe0e950e58140eaf5e02666acfa1e3fc2bdb321f5942003f8fb5a91e: Status 404 returned error can't find the container with id 6ae5d9a6fe0e950e58140eaf5e02666acfa1e3fc2bdb321f5942003f8fb5a91e Mar 18 18:17:16 crc kubenswrapper[4830]: I0318 18:17:16.261383 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889b3ae2-68c7-4466-b8df-c3c2b6628999" path="/var/lib/kubelet/pods/889b3ae2-68c7-4466-b8df-c3c2b6628999/volumes" Mar 18 18:17:16 crc kubenswrapper[4830]: I0318 18:17:16.762983 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9lwr" event={"ID":"2f7546fc-e4cb-438e-8091-74ed74bee260","Type":"ContainerStarted","Data":"6ae5d9a6fe0e950e58140eaf5e02666acfa1e3fc2bdb321f5942003f8fb5a91e"} Mar 18 18:17:17 crc kubenswrapper[4830]: I0318 18:17:17.775311 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9lwr" 
event={"ID":"2f7546fc-e4cb-438e-8091-74ed74bee260","Type":"ContainerStarted","Data":"f83b9205176d5ea5c3d33010a7ffe6e8774a7f3843fb95c4424a88143e1eb77f"} Mar 18 18:17:17 crc kubenswrapper[4830]: I0318 18:17:17.796410 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f9lwr" podStartSLOduration=2.12385222 podStartE2EDuration="2.796385766s" podCreationTimestamp="2026-03-18 18:17:15 +0000 UTC" firstStartedPulling="2026-03-18 18:17:16.18609169 +0000 UTC m=+870.753722052" lastFinishedPulling="2026-03-18 18:17:16.858625216 +0000 UTC m=+871.426255598" observedRunningTime="2026-03-18 18:17:17.795253994 +0000 UTC m=+872.362884336" watchObservedRunningTime="2026-03-18 18:17:17.796385766 +0000 UTC m=+872.364016118" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.204744 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.205225 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zdmkm" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="registry-server" containerID="cri-o://4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044" gracePeriod=2 Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.685444 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.790862 4830 generic.go:334] "Generic (PLEG): container finished" podID="0368d018-687c-405a-bff7-2c333293f213" containerID="4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044" exitCode=0 Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.790908 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerDied","Data":"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044"} Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.790928 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdmkm" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.790954 4830 scope.go:117] "RemoveContainer" containerID="4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.790937 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmkm" event={"ID":"0368d018-687c-405a-bff7-2c333293f213","Type":"ContainerDied","Data":"f9ea7b200fd21d2ad433a3f73ae488422a7eb280dd4ed7987ba256836d735566"} Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.809260 4830 scope.go:117] "RemoveContainer" containerID="32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.826875 4830 scope.go:117] "RemoveContainer" containerID="b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.852786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsv7x\" (UniqueName: \"kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x\") pod 
\"0368d018-687c-405a-bff7-2c333293f213\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.852840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content\") pod \"0368d018-687c-405a-bff7-2c333293f213\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.852984 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities\") pod \"0368d018-687c-405a-bff7-2c333293f213\" (UID: \"0368d018-687c-405a-bff7-2c333293f213\") " Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.853920 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities" (OuterVolumeSpecName: "utilities") pod "0368d018-687c-405a-bff7-2c333293f213" (UID: "0368d018-687c-405a-bff7-2c333293f213"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.854143 4830 scope.go:117] "RemoveContainer" containerID="4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044" Mar 18 18:17:19 crc kubenswrapper[4830]: E0318 18:17:19.854476 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044\": container with ID starting with 4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044 not found: ID does not exist" containerID="4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.854506 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044"} err="failed to get container status \"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044\": rpc error: code = NotFound desc = could not find container \"4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044\": container with ID starting with 4c96bb7b75ef2b6b2fe2cfec68f99ced5f9fe58db1a382bedf803cd488815044 not found: ID does not exist" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.854528 4830 scope.go:117] "RemoveContainer" containerID="32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2" Mar 18 18:17:19 crc kubenswrapper[4830]: E0318 18:17:19.854854 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2\": container with ID starting with 32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2 not found: ID does not exist" containerID="32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.854873 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2"} err="failed to get container status \"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2\": rpc error: code = NotFound desc = could not find container \"32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2\": container with ID starting with 32ff27bc7065620a91aa262d8f66d790188d45cee32cc8fb86c423cf543b4cd2 not found: ID does not exist" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.854885 4830 scope.go:117] "RemoveContainer" containerID="b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6" Mar 18 18:17:19 crc kubenswrapper[4830]: E0318 18:17:19.855233 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6\": container with ID starting with b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6 not found: ID does not exist" containerID="b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.855254 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6"} err="failed to get container status \"b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6\": rpc error: code = NotFound desc = could not find container \"b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6\": container with ID starting with b0b885f3718750727524984ca6de78c869ffdb06052e584e28c4e9b399f9a0c6 not found: ID does not exist" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.858088 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x" 
(OuterVolumeSpecName: "kube-api-access-rsv7x") pod "0368d018-687c-405a-bff7-2c333293f213" (UID: "0368d018-687c-405a-bff7-2c333293f213"). InnerVolumeSpecName "kube-api-access-rsv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.907583 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0368d018-687c-405a-bff7-2c333293f213" (UID: "0368d018-687c-405a-bff7-2c333293f213"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.955107 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.955143 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsv7x\" (UniqueName: \"kubernetes.io/projected/0368d018-687c-405a-bff7-2c333293f213-kube-api-access-rsv7x\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:19 crc kubenswrapper[4830]: I0318 18:17:19.955154 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0368d018-687c-405a-bff7-2c333293f213-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:20 crc kubenswrapper[4830]: I0318 18:17:20.117058 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:20 crc kubenswrapper[4830]: I0318 18:17:20.121273 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zdmkm"] Mar 18 18:17:20 crc kubenswrapper[4830]: I0318 18:17:20.241581 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0368d018-687c-405a-bff7-2c333293f213" path="/var/lib/kubelet/pods/0368d018-687c-405a-bff7-2c333293f213/volumes" Mar 18 18:17:25 crc kubenswrapper[4830]: I0318 18:17:25.738996 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:25 crc kubenswrapper[4830]: I0318 18:17:25.739469 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:25 crc kubenswrapper[4830]: I0318 18:17:25.787088 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:25 crc kubenswrapper[4830]: I0318 18:17:25.888989 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f9lwr" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.265387 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk"] Mar 18 18:17:27 crc kubenswrapper[4830]: E0318 18:17:27.265959 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889b3ae2-68c7-4466-b8df-c3c2b6628999" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.265973 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="889b3ae2-68c7-4466-b8df-c3c2b6628999" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: E0318 18:17:27.265985 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="extract-utilities" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.265993 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="extract-utilities" Mar 18 18:17:27 crc kubenswrapper[4830]: E0318 18:17:27.266004 4830 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.266010 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: E0318 18:17:27.266024 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="extract-content" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.266030 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="extract-content" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.266124 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="889b3ae2-68c7-4466-b8df-c3c2b6628999" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.266136 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0368d018-687c-405a-bff7-2c333293f213" containerName="registry-server" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.266925 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.270831 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-v746s" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.276402 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk"] Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.365968 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.366070 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrl5\" (UniqueName: \"kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.366118 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 
18:17:27.468434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrl5\" (UniqueName: \"kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.468499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.468600 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.469149 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.469197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.507027 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrl5\" (UniqueName: \"kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:27 crc kubenswrapper[4830]: I0318 18:17:27.586885 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:28 crc kubenswrapper[4830]: I0318 18:17:28.012386 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk"] Mar 18 18:17:28 crc kubenswrapper[4830]: E0318 18:17:28.368927 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469b4b97_e3cf_43f8_b161_3dfe6489da28.slice/crio-3511f192c1ece7f49fc0ae85fc40e509798549e74080a2279c814d3b3c7e7879.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:17:28 crc kubenswrapper[4830]: I0318 18:17:28.869881 4830 generic.go:334] "Generic (PLEG): container finished" podID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerID="3511f192c1ece7f49fc0ae85fc40e509798549e74080a2279c814d3b3c7e7879" exitCode=0 Mar 18 18:17:28 crc kubenswrapper[4830]: I0318 18:17:28.869958 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" event={"ID":"469b4b97-e3cf-43f8-b161-3dfe6489da28","Type":"ContainerDied","Data":"3511f192c1ece7f49fc0ae85fc40e509798549e74080a2279c814d3b3c7e7879"} Mar 18 18:17:28 crc kubenswrapper[4830]: I0318 18:17:28.870000 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" event={"ID":"469b4b97-e3cf-43f8-b161-3dfe6489da28","Type":"ContainerStarted","Data":"1384a9642bbbffe9681afe8a3f11623ff6ce4d0d29f7b64df5a655e1f1851115"} Mar 18 18:17:30 crc kubenswrapper[4830]: I0318 18:17:30.886805 4830 generic.go:334] "Generic (PLEG): container finished" podID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerID="8e995f95b31166f66719e58df84e1c6c3d8cf4936e027f7c22d9f6d0ee8b8724" exitCode=0 Mar 18 18:17:30 crc kubenswrapper[4830]: I0318 18:17:30.886992 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" event={"ID":"469b4b97-e3cf-43f8-b161-3dfe6489da28","Type":"ContainerDied","Data":"8e995f95b31166f66719e58df84e1c6c3d8cf4936e027f7c22d9f6d0ee8b8724"} Mar 18 18:17:31 crc kubenswrapper[4830]: I0318 18:17:31.899967 4830 generic.go:334] "Generic (PLEG): container finished" podID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerID="c92ec8718ff06ef95e31b65612ae1c43bd8eaaf4e13d89ceca1474152c152aef" exitCode=0 Mar 18 18:17:31 crc kubenswrapper[4830]: I0318 18:17:31.900042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" event={"ID":"469b4b97-e3cf-43f8-b161-3dfe6489da28","Type":"ContainerDied","Data":"c92ec8718ff06ef95e31b65612ae1c43bd8eaaf4e13d89ceca1474152c152aef"} Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.290502 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.469258 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle\") pod \"469b4b97-e3cf-43f8-b161-3dfe6489da28\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.469413 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvrl5\" (UniqueName: \"kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5\") pod \"469b4b97-e3cf-43f8-b161-3dfe6489da28\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.469457 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util\") pod \"469b4b97-e3cf-43f8-b161-3dfe6489da28\" (UID: \"469b4b97-e3cf-43f8-b161-3dfe6489da28\") " Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.471059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle" (OuterVolumeSpecName: "bundle") pod "469b4b97-e3cf-43f8-b161-3dfe6489da28" (UID: "469b4b97-e3cf-43f8-b161-3dfe6489da28"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.482074 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5" (OuterVolumeSpecName: "kube-api-access-wvrl5") pod "469b4b97-e3cf-43f8-b161-3dfe6489da28" (UID: "469b4b97-e3cf-43f8-b161-3dfe6489da28"). InnerVolumeSpecName "kube-api-access-wvrl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.571762 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.571831 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvrl5\" (UniqueName: \"kubernetes.io/projected/469b4b97-e3cf-43f8-b161-3dfe6489da28-kube-api-access-wvrl5\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.648688 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util" (OuterVolumeSpecName: "util") pod "469b4b97-e3cf-43f8-b161-3dfe6489da28" (UID: "469b4b97-e3cf-43f8-b161-3dfe6489da28"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.673150 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/469b4b97-e3cf-43f8-b161-3dfe6489da28-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.921787 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" event={"ID":"469b4b97-e3cf-43f8-b161-3dfe6489da28","Type":"ContainerDied","Data":"1384a9642bbbffe9681afe8a3f11623ff6ce4d0d29f7b64df5a655e1f1851115"} Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.921841 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1384a9642bbbffe9681afe8a3f11623ff6ce4d0d29f7b64df5a655e1f1851115" Mar 18 18:17:33 crc kubenswrapper[4830]: I0318 18:17:33.921898 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.051562 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx"] Mar 18 18:17:36 crc kubenswrapper[4830]: E0318 18:17:36.052652 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="util" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.052669 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="util" Mar 18 18:17:36 crc kubenswrapper[4830]: E0318 18:17:36.052686 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="extract" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.052693 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="extract" Mar 18 18:17:36 crc kubenswrapper[4830]: E0318 18:17:36.052714 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="pull" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.052724 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="pull" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.052888 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="469b4b97-e3cf-43f8-b161-3dfe6489da28" containerName="extract" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.053499 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.060148 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rj4q5" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.091678 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx"] Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.107535 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvf6w\" (UniqueName: \"kubernetes.io/projected/2897dfaf-4627-4986-8920-e6c789387c3c-kube-api-access-hvf6w\") pod \"openstack-operator-controller-init-b85c4d696-j9fcx\" (UID: \"2897dfaf-4627-4986-8920-e6c789387c3c\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.208435 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvf6w\" (UniqueName: \"kubernetes.io/projected/2897dfaf-4627-4986-8920-e6c789387c3c-kube-api-access-hvf6w\") pod \"openstack-operator-controller-init-b85c4d696-j9fcx\" (UID: \"2897dfaf-4627-4986-8920-e6c789387c3c\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.229027 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvf6w\" (UniqueName: \"kubernetes.io/projected/2897dfaf-4627-4986-8920-e6c789387c3c-kube-api-access-hvf6w\") pod \"openstack-operator-controller-init-b85c4d696-j9fcx\" (UID: \"2897dfaf-4627-4986-8920-e6c789387c3c\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.370415 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.777686 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx"] Mar 18 18:17:36 crc kubenswrapper[4830]: I0318 18:17:36.943435 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" event={"ID":"2897dfaf-4627-4986-8920-e6c789387c3c","Type":"ContainerStarted","Data":"0744f59996ff5808ee413c43bfccba10ba86fb440c832f1a29eb31fd4ddaa9e6"} Mar 18 18:17:42 crc kubenswrapper[4830]: I0318 18:17:42.984687 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" event={"ID":"2897dfaf-4627-4986-8920-e6c789387c3c","Type":"ContainerStarted","Data":"b61f469b88e48819e911770f65bc708741d500504f53cc447ea5e65d810bc118"} Mar 18 18:17:42 crc kubenswrapper[4830]: I0318 18:17:42.985260 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:17:43 crc kubenswrapper[4830]: I0318 18:17:43.016081 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" podStartSLOduration=1.442179231 podStartE2EDuration="7.016056865s" podCreationTimestamp="2026-03-18 18:17:36 +0000 UTC" firstStartedPulling="2026-03-18 18:17:36.790882237 +0000 UTC m=+891.358512559" lastFinishedPulling="2026-03-18 18:17:42.364759861 +0000 UTC m=+896.932390193" observedRunningTime="2026-03-18 18:17:43.009551933 +0000 UTC m=+897.577182305" watchObservedRunningTime="2026-03-18 18:17:43.016056865 +0000 UTC m=+897.583687217" Mar 18 18:17:56 crc kubenswrapper[4830]: I0318 18:17:56.376259 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-b85c4d696-j9fcx" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.149662 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564298-2ll76"] Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.151528 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-2ll76" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.154700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.154815 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.154864 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.169402 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-2ll76"] Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.204659 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvtf\" (UniqueName: \"kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf\") pod \"auto-csr-approver-29564298-2ll76\" (UID: \"fe310aca-eb6c-414a-ace8-e55bc2fd4133\") " pod="openshift-infra/auto-csr-approver-29564298-2ll76" Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.306381 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvtf\" (UniqueName: \"kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf\") pod \"auto-csr-approver-29564298-2ll76\" (UID: \"fe310aca-eb6c-414a-ace8-e55bc2fd4133\") " pod="openshift-infra/auto-csr-approver-29564298-2ll76" 
Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.324481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvtf\" (UniqueName: \"kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf\") pod \"auto-csr-approver-29564298-2ll76\" (UID: \"fe310aca-eb6c-414a-ace8-e55bc2fd4133\") " pod="openshift-infra/auto-csr-approver-29564298-2ll76"
Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.480826 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-2ll76"
Mar 18 18:18:00 crc kubenswrapper[4830]: I0318 18:18:00.950738 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-2ll76"]
Mar 18 18:18:01 crc kubenswrapper[4830]: I0318 18:18:01.137204 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-2ll76" event={"ID":"fe310aca-eb6c-414a-ace8-e55bc2fd4133","Type":"ContainerStarted","Data":"d8dde249d1d7510ce85ebcb40bd995f8bb34696a425079b7bfc8263dcb0d865a"}
Mar 18 18:18:03 crc kubenswrapper[4830]: I0318 18:18:03.156540 4830 generic.go:334] "Generic (PLEG): container finished" podID="fe310aca-eb6c-414a-ace8-e55bc2fd4133" containerID="0d178a10cd244dd97bedc863ccf064d241d53860aa33a14df01d0859301aff05" exitCode=0
Mar 18 18:18:03 crc kubenswrapper[4830]: I0318 18:18:03.156627 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-2ll76" event={"ID":"fe310aca-eb6c-414a-ace8-e55bc2fd4133","Type":"ContainerDied","Data":"0d178a10cd244dd97bedc863ccf064d241d53860aa33a14df01d0859301aff05"}
Mar 18 18:18:04 crc kubenswrapper[4830]: I0318 18:18:04.506579 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-2ll76"
Mar 18 18:18:04 crc kubenswrapper[4830]: I0318 18:18:04.616652 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvtf\" (UniqueName: \"kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf\") pod \"fe310aca-eb6c-414a-ace8-e55bc2fd4133\" (UID: \"fe310aca-eb6c-414a-ace8-e55bc2fd4133\") "
Mar 18 18:18:04 crc kubenswrapper[4830]: I0318 18:18:04.624123 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf" (OuterVolumeSpecName: "kube-api-access-sfvtf") pod "fe310aca-eb6c-414a-ace8-e55bc2fd4133" (UID: "fe310aca-eb6c-414a-ace8-e55bc2fd4133"). InnerVolumeSpecName "kube-api-access-sfvtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:18:04 crc kubenswrapper[4830]: I0318 18:18:04.718311 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfvtf\" (UniqueName: \"kubernetes.io/projected/fe310aca-eb6c-414a-ace8-e55bc2fd4133-kube-api-access-sfvtf\") on node \"crc\" DevicePath \"\""
Mar 18 18:18:05 crc kubenswrapper[4830]: I0318 18:18:05.201907 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-2ll76" event={"ID":"fe310aca-eb6c-414a-ace8-e55bc2fd4133","Type":"ContainerDied","Data":"d8dde249d1d7510ce85ebcb40bd995f8bb34696a425079b7bfc8263dcb0d865a"}
Mar 18 18:18:05 crc kubenswrapper[4830]: I0318 18:18:05.202222 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8dde249d1d7510ce85ebcb40bd995f8bb34696a425079b7bfc8263dcb0d865a"
Mar 18 18:18:05 crc kubenswrapper[4830]: I0318 18:18:05.202023 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-2ll76"
Mar 18 18:18:05 crc kubenswrapper[4830]: I0318 18:18:05.564461 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-cw525"]
Mar 18 18:18:05 crc kubenswrapper[4830]: I0318 18:18:05.569449 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-cw525"]
Mar 18 18:18:06 crc kubenswrapper[4830]: I0318 18:18:06.243950 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0" path="/var/lib/kubelet/pods/cebfa1a5-fbb1-42df-91aa-7b6b07bf72d0/volumes"
Mar 18 18:18:10 crc kubenswrapper[4830]: I0318 18:18:10.646332 4830 scope.go:117] "RemoveContainer" containerID="7e77e9b2ec82769db03ddeef037c29764253859bdb47ad46ff58506f55cd808a"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.591587 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"]
Mar 18 18:18:16 crc kubenswrapper[4830]: E0318 18:18:16.594285 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe310aca-eb6c-414a-ace8-e55bc2fd4133" containerName="oc"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.594419 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe310aca-eb6c-414a-ace8-e55bc2fd4133" containerName="oc"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.594714 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe310aca-eb6c-414a-ace8-e55bc2fd4133" containerName="oc"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.595462 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.597883 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.598901 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.601285 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mc8gw"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.601826 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qznf5"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.612444 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.613369 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.624032 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9wmzs"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.635795 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.643379 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.668833 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.669837 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.680429 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xrwvv"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.689906 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.691040 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.696898 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f7mfc"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.707573 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9b6s\" (UniqueName: \"kubernetes.io/projected/09f3d007-f621-4c30-a3f8-f3280a7db75d-kube-api-access-c9b6s\") pod \"barbican-operator-controller-manager-59bc569d95-n478g\" (UID: \"09f3d007-f621-4c30-a3f8-f3280a7db75d\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.707624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8v9\" (UniqueName: \"kubernetes.io/projected/ff6a5b70-c9ae-4087-b4fd-e24a712e6e33-kube-api-access-ch8v9\") pod \"cinder-operator-controller-manager-8d58dc466-4tflr\" (UID: \"ff6a5b70-c9ae-4087-b4fd-e24a712e6e33\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.707667 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpdz\" (UniqueName: \"kubernetes.io/projected/cebd0fbd-7733-464a-aead-539d69b70b04-kube-api-access-gfpdz\") pod \"designate-operator-controller-manager-588d4d986b-bw766\" (UID: \"cebd0fbd-7733-464a-aead-539d69b70b04\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.712188 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.740928 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.750154 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.765454 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.766255 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.779309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6xmnn"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.812304 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmhb\" (UniqueName: \"kubernetes.io/projected/bd94092b-4a34-4f83-9aaa-5ddac374e97a-kube-api-access-8tmhb\") pod \"glance-operator-controller-manager-79df6bcc97-cspdp\" (UID: \"bd94092b-4a34-4f83-9aaa-5ddac374e97a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.812393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9b6s\" (UniqueName: \"kubernetes.io/projected/09f3d007-f621-4c30-a3f8-f3280a7db75d-kube-api-access-c9b6s\") pod \"barbican-operator-controller-manager-59bc569d95-n478g\" (UID: \"09f3d007-f621-4c30-a3f8-f3280a7db75d\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.812425 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8v9\" (UniqueName: \"kubernetes.io/projected/ff6a5b70-c9ae-4087-b4fd-e24a712e6e33-kube-api-access-ch8v9\") pod \"cinder-operator-controller-manager-8d58dc466-4tflr\" (UID: \"ff6a5b70-c9ae-4087-b4fd-e24a712e6e33\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.812460 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lc7\" (UniqueName: \"kubernetes.io/projected/b1bf7404-a81d-42e8-bc1b-157c5cd791b0-kube-api-access-89lc7\") pod \"heat-operator-controller-manager-67dd5f86f5-84z2j\" (UID: \"b1bf7404-a81d-42e8-bc1b-157c5cd791b0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.812489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpdz\" (UniqueName: \"kubernetes.io/projected/cebd0fbd-7733-464a-aead-539d69b70b04-kube-api-access-gfpdz\") pod \"designate-operator-controller-manager-588d4d986b-bw766\" (UID: \"cebd0fbd-7733-464a-aead-539d69b70b04\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.815071 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.816681 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.819426 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tx4km"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.819510 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.834435 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.867061 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8v9\" (UniqueName: \"kubernetes.io/projected/ff6a5b70-c9ae-4087-b4fd-e24a712e6e33-kube-api-access-ch8v9\") pod \"cinder-operator-controller-manager-8d58dc466-4tflr\" (UID: \"ff6a5b70-c9ae-4087-b4fd-e24a712e6e33\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.868847 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.873194 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-w8x8f"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.876175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpdz\" (UniqueName: \"kubernetes.io/projected/cebd0fbd-7733-464a-aead-539d69b70b04-kube-api-access-gfpdz\") pod \"designate-operator-controller-manager-588d4d986b-bw766\" (UID: \"cebd0fbd-7733-464a-aead-539d69b70b04\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.891081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9b6s\" (UniqueName: \"kubernetes.io/projected/09f3d007-f621-4c30-a3f8-f3280a7db75d-kube-api-access-c9b6s\") pod \"barbican-operator-controller-manager-59bc569d95-n478g\" (UID: \"09f3d007-f621-4c30-a3f8-f3280a7db75d\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.894505 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.903034 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.909602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.913642 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kss\" (UniqueName: \"kubernetes.io/projected/45854217-6284-4678-903f-d64b4088ec29-kube-api-access-44kss\") pod \"horizon-operator-controller-manager-8464cc45fb-b2f94\" (UID: \"45854217-6284-4678-903f-d64b4088ec29\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.913712 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmhb\" (UniqueName: \"kubernetes.io/projected/bd94092b-4a34-4f83-9aaa-5ddac374e97a-kube-api-access-8tmhb\") pod \"glance-operator-controller-manager-79df6bcc97-cspdp\" (UID: \"bd94092b-4a34-4f83-9aaa-5ddac374e97a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.913745 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6zg\" (UniqueName: \"kubernetes.io/projected/3d9eef66-a93f-432a-8391-f6a55dc3f800-kube-api-access-5l6zg\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.913797 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lc7\" (UniqueName: \"kubernetes.io/projected/b1bf7404-a81d-42e8-bc1b-157c5cd791b0-kube-api-access-89lc7\") pod \"heat-operator-controller-manager-67dd5f86f5-84z2j\" (UID: \"b1bf7404-a81d-42e8-bc1b-157c5cd791b0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.913826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.926027 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.926838 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.929166 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ff7bz"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.938252 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.940753 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmhb\" (UniqueName: \"kubernetes.io/projected/bd94092b-4a34-4f83-9aaa-5ddac374e97a-kube-api-access-8tmhb\") pod \"glance-operator-controller-manager-79df6bcc97-cspdp\" (UID: \"bd94092b-4a34-4f83-9aaa-5ddac374e97a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.954124 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.957859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lc7\" (UniqueName: \"kubernetes.io/projected/b1bf7404-a81d-42e8-bc1b-157c5cd791b0-kube-api-access-89lc7\") pod \"heat-operator-controller-manager-67dd5f86f5-84z2j\" (UID: \"b1bf7404-a81d-42e8-bc1b-157c5cd791b0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.960333 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.970183 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.977878 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.978760 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.982501 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-29bxs"
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.991677 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"]
Mar 18 18:18:16 crc kubenswrapper[4830]: I0318 18:18:16.993464 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:16.999654 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jw9sr"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.005043 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.012288 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.014805 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.014852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kss\" (UniqueName: \"kubernetes.io/projected/45854217-6284-4678-903f-d64b4088ec29-kube-api-access-44kss\") pod \"horizon-operator-controller-manager-8464cc45fb-b2f94\" (UID: \"45854217-6284-4678-903f-d64b4088ec29\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.014878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5wr\" (UniqueName: \"kubernetes.io/projected/7e1c4c11-ddb2-45a3-94eb-8b5b27866996-kube-api-access-ft5wr\") pod \"keystone-operator-controller-manager-768b96df4c-dls7x\" (UID: \"7e1c4c11-ddb2-45a3-94eb-8b5b27866996\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.014913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcf9\" (UniqueName: \"kubernetes.io/projected/abc98dd9-9c79-45a2-a641-023633c1b75b-kube-api-access-zmcf9\") pod \"ironic-operator-controller-manager-6f787dddc9-nvkbv\" (UID: \"abc98dd9-9c79-45a2-a641-023633c1b75b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.014951 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6zg\" (UniqueName: \"kubernetes.io/projected/3d9eef66-a93f-432a-8391-f6a55dc3f800-kube-api-access-5l6zg\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.015126 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.015226 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:17.515191544 +0000 UTC m=+932.082821876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.023060 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.023729 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.025970 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kpzsp"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.027997 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.028289 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.032261 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.032962 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.033536 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6zg\" (UniqueName: \"kubernetes.io/projected/3d9eef66-a93f-432a-8391-f6a55dc3f800-kube-api-access-5l6zg\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.034145 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kss\" (UniqueName: \"kubernetes.io/projected/45854217-6284-4678-903f-d64b4088ec29-kube-api-access-44kss\") pod \"horizon-operator-controller-manager-8464cc45fb-b2f94\" (UID: \"45854217-6284-4678-903f-d64b4088ec29\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.036316 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.036457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2nv5t"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.051388 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.058173 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.059262 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.063861 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mw5rd"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.069433 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.077290 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.078081 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.082879 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.082976 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8h78k"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.085661 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d76n9"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.093304 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.097869 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p8f6g"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.098258 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.099852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.107197 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d76n9"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.115813 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116177 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmcf9\" (UniqueName: \"kubernetes.io/projected/abc98dd9-9c79-45a2-a641-023633c1b75b-kube-api-access-zmcf9\") pod \"ironic-operator-controller-manager-6f787dddc9-nvkbv\" (UID: \"abc98dd9-9c79-45a2-a641-023633c1b75b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116226 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62f2\" (UniqueName: \"kubernetes.io/projected/2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8-kube-api-access-t62f2\") pod \"neutron-operator-controller-manager-767865f676-6jmp6\" (UID: \"2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mtf\" (UniqueName: \"kubernetes.io/projected/6a1a3dc1-1535-4091-97a8-abc6dc2d1388-kube-api-access-h6mtf\") pod \"mariadb-operator-controller-manager-67ccfc9778-vjzn6\" (UID: \"6a1a3dc1-1535-4091-97a8-abc6dc2d1388\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116306 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtf95\" (UniqueName: \"kubernetes.io/projected/db7f527f-6421-4a26-9eae-5a68054b2a88-kube-api-access-wtf95\") pod \"manila-operator-controller-manager-55f864c847-cpvkm\" (UID: \"db7f527f-6421-4a26-9eae-5a68054b2a88\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlftd\" (UniqueName: \"kubernetes.io/projected/1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8-kube-api-access-vlftd\") pod \"nova-operator-controller-manager-5d488d59fb-4mzzb\" (UID: \"1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.116401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5wr\" (UniqueName: \"kubernetes.io/projected/7e1c4c11-ddb2-45a3-94eb-8b5b27866996-kube-api-access-ft5wr\") pod \"keystone-operator-controller-manager-768b96df4c-dls7x\" (UID: \"7e1c4c11-ddb2-45a3-94eb-8b5b27866996\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.123455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.130498 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.136470 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.137541 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-j885n"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.137681 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.139894 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-6fr68"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.167447 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcf9\" (UniqueName: \"kubernetes.io/projected/abc98dd9-9c79-45a2-a641-023633c1b75b-kube-api-access-zmcf9\") pod \"ironic-operator-controller-manager-6f787dddc9-nvkbv\" (UID: \"abc98dd9-9c79-45a2-a641-023633c1b75b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.170636 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5wr\" (UniqueName: \"kubernetes.io/projected/7e1c4c11-ddb2-45a3-94eb-8b5b27866996-kube-api-access-ft5wr\") pod \"keystone-operator-controller-manager-768b96df4c-dls7x\" (UID: \"7e1c4c11-ddb2-45a3-94eb-8b5b27866996\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.180606 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.206533 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"]
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.207657 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.210264 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-zkzv7"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224066 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224148 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlftd\" (UniqueName: \"kubernetes.io/projected/1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8-kube-api-access-vlftd\") pod \"nova-operator-controller-manager-5d488d59fb-4mzzb\" (UID: \"1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224259 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l5n\" (UniqueName: \"kubernetes.io/projected/d60eef08-1564-405f-b4c1-3f391bbf741d-kube-api-access-w8l5n\") pod \"swift-operator-controller-manager-c674c5965-thwsc\" (UID: \"d60eef08-1564-405f-b4c1-3f391bbf741d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqh8x\" (UniqueName: \"kubernetes.io/projected/28e6b5ce-47e2-43fb-b524-c2e642dbd166-kube-api-access-bqh8x\") pod \"octavia-operator-controller-manager-5b9f45d989-6282s\" (UID: \"28e6b5ce-47e2-43fb-b524-c2e642dbd166\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224326 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrsp\" (UniqueName: \"kubernetes.io/projected/ed9931be-8631-4e73-92bd-ff18076dcb69-kube-api-access-htrsp\") pod \"placement-operator-controller-manager-5784578c99-ct2qr\" (UID: \"ed9931be-8631-4e73-92bd-ff18076dcb69\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62f2\" (UniqueName: \"kubernetes.io/projected/2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8-kube-api-access-t62f2\") pod \"neutron-operator-controller-manager-767865f676-6jmp6\" (UID: \"2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"
Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5h9z\"
(UniqueName: \"kubernetes.io/projected/7a872110-8984-419e-b5ed-177ec5669cfc-kube-api-access-m5h9z\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mtf\" (UniqueName: \"kubernetes.io/projected/6a1a3dc1-1535-4091-97a8-abc6dc2d1388-kube-api-access-h6mtf\") pod \"mariadb-operator-controller-manager-67ccfc9778-vjzn6\" (UID: \"6a1a3dc1-1535-4091-97a8-abc6dc2d1388\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9dv5\" (UniqueName: \"kubernetes.io/projected/e7de1579-8ed8-4434-818e-a5ea0c366cf7-kube-api-access-r9dv5\") pod \"ovn-operator-controller-manager-884679f54-d76n9\" (UID: \"e7de1579-8ed8-4434-818e-a5ea0c366cf7\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.224511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtf95\" (UniqueName: \"kubernetes.io/projected/db7f527f-6421-4a26-9eae-5a68054b2a88-kube-api-access-wtf95\") pod \"manila-operator-controller-manager-55f864c847-cpvkm\" (UID: \"db7f527f-6421-4a26-9eae-5a68054b2a88\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.225271 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.227861 4830 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.244491 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62f2\" (UniqueName: \"kubernetes.io/projected/2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8-kube-api-access-t62f2\") pod \"neutron-operator-controller-manager-767865f676-6jmp6\" (UID: \"2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.245550 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlftd\" (UniqueName: \"kubernetes.io/projected/1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8-kube-api-access-vlftd\") pod \"nova-operator-controller-manager-5d488d59fb-4mzzb\" (UID: \"1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.247344 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtf95\" (UniqueName: \"kubernetes.io/projected/db7f527f-6421-4a26-9eae-5a68054b2a88-kube-api-access-wtf95\") pod \"manila-operator-controller-manager-55f864c847-cpvkm\" (UID: \"db7f527f-6421-4a26-9eae-5a68054b2a88\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.264859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mtf\" (UniqueName: \"kubernetes.io/projected/6a1a3dc1-1535-4091-97a8-abc6dc2d1388-kube-api-access-h6mtf\") pod \"mariadb-operator-controller-manager-67ccfc9778-vjzn6\" (UID: \"6a1a3dc1-1535-4091-97a8-abc6dc2d1388\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.326756 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328545 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5h9z\" (UniqueName: \"kubernetes.io/projected/7a872110-8984-419e-b5ed-177ec5669cfc-kube-api-access-m5h9z\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9dv5\" (UniqueName: \"kubernetes.io/projected/e7de1579-8ed8-4434-818e-a5ea0c366cf7-kube-api-access-r9dv5\") pod \"ovn-operator-controller-manager-884679f54-d76n9\" (UID: \"e7de1579-8ed8-4434-818e-a5ea0c366cf7\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328726 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8l5n\" (UniqueName: \"kubernetes.io/projected/d60eef08-1564-405f-b4c1-3f391bbf741d-kube-api-access-w8l5n\") pod \"swift-operator-controller-manager-c674c5965-thwsc\" (UID: \"d60eef08-1564-405f-b4c1-3f391bbf741d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" Mar 18 
18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqh8x\" (UniqueName: \"kubernetes.io/projected/28e6b5ce-47e2-43fb-b524-c2e642dbd166-kube-api-access-bqh8x\") pod \"octavia-operator-controller-manager-5b9f45d989-6282s\" (UID: \"28e6b5ce-47e2-43fb-b524-c2e642dbd166\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328766 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrsp\" (UniqueName: \"kubernetes.io/projected/ed9931be-8631-4e73-92bd-ff18076dcb69-kube-api-access-htrsp\") pod \"placement-operator-controller-manager-5784578c99-ct2qr\" (UID: \"ed9931be-8631-4e73-92bd-ff18076dcb69\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.328849 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rcw\" (UniqueName: \"kubernetes.io/projected/debd9df4-01da-4f5e-b66c-d1f7bd08574f-kube-api-access-b4rcw\") pod \"telemetry-operator-controller-manager-d6b694c5-pfj9c\" (UID: \"debd9df4-01da-4f5e-b66c-d1f7bd08574f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.329832 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.329876 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. 
No retries permitted until 2026-03-18 18:18:17.829861968 +0000 UTC m=+932.397492300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.333033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.350845 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.351741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9dv5\" (UniqueName: \"kubernetes.io/projected/e7de1579-8ed8-4434-818e-a5ea0c366cf7-kube-api-access-r9dv5\") pod \"ovn-operator-controller-manager-884679f54-d76n9\" (UID: \"e7de1579-8ed8-4434-818e-a5ea0c366cf7\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.351913 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.353264 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.361707 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.363804 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.371487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrsp\" (UniqueName: \"kubernetes.io/projected/ed9931be-8631-4e73-92bd-ff18076dcb69-kube-api-access-htrsp\") pod \"placement-operator-controller-manager-5784578c99-ct2qr\" (UID: \"ed9931be-8631-4e73-92bd-ff18076dcb69\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.371653 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-g99p4" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.380655 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.392475 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqh8x\" (UniqueName: \"kubernetes.io/projected/28e6b5ce-47e2-43fb-b524-c2e642dbd166-kube-api-access-bqh8x\") pod \"octavia-operator-controller-manager-5b9f45d989-6282s\" (UID: \"28e6b5ce-47e2-43fb-b524-c2e642dbd166\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.400032 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5h9z\" (UniqueName: \"kubernetes.io/projected/7a872110-8984-419e-b5ed-177ec5669cfc-kube-api-access-m5h9z\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.401823 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l5n\" (UniqueName: \"kubernetes.io/projected/d60eef08-1564-405f-b4c1-3f391bbf741d-kube-api-access-w8l5n\") pod \"swift-operator-controller-manager-c674c5965-thwsc\" (UID: \"d60eef08-1564-405f-b4c1-3f391bbf741d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.403604 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.425680 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.427852 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.430521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rcw\" (UniqueName: \"kubernetes.io/projected/debd9df4-01da-4f5e-b66c-d1f7bd08574f-kube-api-access-b4rcw\") pod \"telemetry-operator-controller-manager-d6b694c5-pfj9c\" (UID: \"debd9df4-01da-4f5e-b66c-d1f7bd08574f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.430568 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l5vbm" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.431220 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.441236 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.442471 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.446517 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.448879 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.449103 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.449399 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prts6" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.449490 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.455181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rcw\" (UniqueName: \"kubernetes.io/projected/debd9df4-01da-4f5e-b66c-d1f7bd08574f-kube-api-access-b4rcw\") pod \"telemetry-operator-controller-manager-d6b694c5-pfj9c\" (UID: \"debd9df4-01da-4f5e-b66c-d1f7bd08574f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.470675 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.479241 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.485412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j4c76" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.488391 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t64c\" (UniqueName: \"kubernetes.io/projected/f68f6008-a3fb-4039-85a0-c0475455ac09-kube-api-access-7t64c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-r9bbq\" (UID: \"f68f6008-a3fb-4039-85a0-c0475455ac09\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535338 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535358 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535389 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhts\" (UniqueName: \"kubernetes.io/projected/8c1cd3c4-f399-4810-bdaf-53644d7555ff-kube-api-access-hvhts\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.535553 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2nl2\" (UniqueName: \"kubernetes.io/projected/6fd482c2-5871-4d59-8402-1e57b06055b0-kube-api-access-v2nl2\") pod \"test-operator-controller-manager-5c5cb9c4d7-x9zq5\" (UID: \"6fd482c2-5871-4d59-8402-1e57b06055b0\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.535670 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.535712 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:18.53569672 +0000 UTC m=+933.103327052 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.589152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t64c\" (UniqueName: \"kubernetes.io/projected/f68f6008-a3fb-4039-85a0-c0475455ac09-kube-api-access-7t64c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-r9bbq\" (UID: \"f68f6008-a3fb-4039-85a0-c0475455ac09\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636362 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636385 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636431 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hvhts\" (UniqueName: \"kubernetes.io/projected/8c1cd3c4-f399-4810-bdaf-53644d7555ff-kube-api-access-hvhts\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636486 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvbz\" (UniqueName: \"kubernetes.io/projected/e663b49d-cd0f-4a18-8284-b12dad6c136a-kube-api-access-nwvbz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85j54\" (UID: \"e663b49d-cd0f-4a18-8284-b12dad6c136a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636507 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2nl2\" (UniqueName: \"kubernetes.io/projected/6fd482c2-5871-4d59-8402-1e57b06055b0-kube-api-access-v2nl2\") pod \"test-operator-controller-manager-5c5cb9c4d7-x9zq5\" (UID: \"6fd482c2-5871-4d59-8402-1e57b06055b0\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.636683 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.636724 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:18.136708704 +0000 UTC m=+932.704339036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.636855 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.636883 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:18.136874588 +0000 UTC m=+932.704504920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.636908 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.646174 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.656629 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t64c\" (UniqueName: \"kubernetes.io/projected/f68f6008-a3fb-4039-85a0-c0475455ac09-kube-api-access-7t64c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-r9bbq\" (UID: \"f68f6008-a3fb-4039-85a0-c0475455ac09\") " 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.656646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhts\" (UniqueName: \"kubernetes.io/projected/8c1cd3c4-f399-4810-bdaf-53644d7555ff-kube-api-access-hvhts\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.660439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2nl2\" (UniqueName: \"kubernetes.io/projected/6fd482c2-5871-4d59-8402-1e57b06055b0-kube-api-access-v2nl2\") pod \"test-operator-controller-manager-5c5cb9c4d7-x9zq5\" (UID: \"6fd482c2-5871-4d59-8402-1e57b06055b0\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.683197 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" Mar 18 18:18:17 crc kubenswrapper[4830]: W0318 18:18:17.731021 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcebd0fbd_7733_464a_aead_539d69b70b04.slice/crio-4a2d367b31e7c4b172d9e9998d0d104ed6f711e3640e8c81bd2ebeee2d11c7f3 WatchSource:0}: Error finding container 4a2d367b31e7c4b172d9e9998d0d104ed6f711e3640e8c81bd2ebeee2d11c7f3: Status 404 returned error can't find the container with id 4a2d367b31e7c4b172d9e9998d0d104ed6f711e3640e8c81bd2ebeee2d11c7f3 Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.737063 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvbz\" (UniqueName: \"kubernetes.io/projected/e663b49d-cd0f-4a18-8284-b12dad6c136a-kube-api-access-nwvbz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85j54\" (UID: \"e663b49d-cd0f-4a18-8284-b12dad6c136a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.748889 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.763041 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.769541 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvbz\" (UniqueName: \"kubernetes.io/projected/e663b49d-cd0f-4a18-8284-b12dad6c136a-kube-api-access-nwvbz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85j54\" (UID: \"e663b49d-cd0f-4a18-8284-b12dad6c136a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.801908 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.835408 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.837747 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.838015 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: E0318 18:18:17.838057 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. 
No retries permitted until 2026-03-18 18:18:18.838044691 +0000 UTC m=+933.405675023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.838336 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.844688 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.873699 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"] Mar 18 18:18:17 crc kubenswrapper[4830]: I0318 18:18:17.873750 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"] Mar 18 18:18:17 crc kubenswrapper[4830]: W0318 18:18:17.914395 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd94092b_4a34_4f83_9aaa_5ddac374e97a.slice/crio-4aa91ec7919ad2e049b7396acad45ba0647f69bdc999d2e83466481032abc4a2 WatchSource:0}: Error finding container 4aa91ec7919ad2e049b7396acad45ba0647f69bdc999d2e83466481032abc4a2: Status 404 returned error can't find the container with id 4aa91ec7919ad2e049b7396acad45ba0647f69bdc999d2e83466481032abc4a2 Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.142098 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.142546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.142490 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.142900 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:19.142881791 +0000 UTC m=+933.710512113 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.142760 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.143401 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:19.143377275 +0000 UTC m=+933.711007607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.264120 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.264172 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.264186 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.281461 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.312716 
4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.366096 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.372734 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.390550 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.401126 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.403987 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp" event={"ID":"bd94092b-4a34-4f83-9aaa-5ddac374e97a","Type":"ContainerStarted","Data":"4aa91ec7919ad2e049b7396acad45ba0647f69bdc999d2e83466481032abc4a2"} Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.404072 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7de1579_8ed8_4434_818e_a5ea0c366cf7.slice/crio-c0fc75473ecd59089ebc5a694f0fa92acb8a712d69992c5b5d8da4ea9813b765 WatchSource:0}: Error finding container c0fc75473ecd59089ebc5a694f0fa92acb8a712d69992c5b5d8da4ea9813b765: Status 404 returned error can't find the container with id c0fc75473ecd59089ebc5a694f0fa92acb8a712d69992c5b5d8da4ea9813b765 Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.406500 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6" event={"ID":"2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8","Type":"ContainerStarted","Data":"5eb615dba5e7130b8f5d5a91082035c1636b8577e56a6d944d7ceb2db41e6418"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.406665 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d76n9"] Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.406717 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6mtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-vjzn6_openstack-operators(6a1a3dc1-1535-4091-97a8-abc6dc2d1388): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.407872 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" podUID="6a1a3dc1-1535-4091-97a8-abc6dc2d1388" Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.409177 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e6b5ce_47e2_43fb_b524_c2e642dbd166.slice/crio-211b67c1bd0eef5720077305ea032426dd84ad0731d56fe40aa3e037da2a176b WatchSource:0}: Error finding container 211b67c1bd0eef5720077305ea032426dd84ad0731d56fe40aa3e037da2a176b: Status 404 returned error can't find the container with id 211b67c1bd0eef5720077305ea032426dd84ad0731d56fe40aa3e037da2a176b Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.409512 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j" event={"ID":"b1bf7404-a81d-42e8-bc1b-157c5cd791b0","Type":"ContainerStarted","Data":"e779638d636a73a781c24ee09d1f76395457f37992b2cb72857d4de0bc7272e5"} Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.410126 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9dv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-d76n9_openstack-operators(e7de1579-8ed8-4434-818e-a5ea0c366cf7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.411188 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" podUID="e7de1579-8ed8-4434-818e-a5ea0c366cf7" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.411282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" event={"ID":"ed9931be-8631-4e73-92bd-ff18076dcb69","Type":"ContainerStarted","Data":"36347e59c336bac06c2247eee57acc792c50088a620ba689d549dc1b9e89ea09"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.413864 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" 
event={"ID":"7e1c4c11-ddb2-45a3-94eb-8b5b27866996","Type":"ContainerStarted","Data":"e884ba0099da29f78f7dcc77234a2f8a66e47c5218c83d5e86074bf2f2892cff"} Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.415162 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bqh8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-6282s_openstack-operators(28e6b5ce-47e2-43fb-b524-c2e642dbd166): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.415361 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766" event={"ID":"cebd0fbd-7733-464a-aead-539d69b70b04","Type":"ContainerStarted","Data":"4a2d367b31e7c4b172d9e9998d0d104ed6f711e3640e8c81bd2ebeee2d11c7f3"} Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.416324 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" podUID="28e6b5ce-47e2-43fb-b524-c2e642dbd166" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.421251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv" 
event={"ID":"abc98dd9-9c79-45a2-a641-023633c1b75b","Type":"ContainerStarted","Data":"8022ca549e7df826c3c6892f7cae1e781da5f078da6b1138eb0935763eb9e4f0"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.422270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94" event={"ID":"45854217-6284-4678-903f-d64b4088ec29","Type":"ContainerStarted","Data":"c074c6e412bcb9f217201d705241b4631bb13760207396c3a40b5df6c0ac1e92"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.424935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" event={"ID":"db7f527f-6421-4a26-9eae-5a68054b2a88","Type":"ContainerStarted","Data":"e1101c55fb8998e67a2877cc478734df25373578f0fa26cea2e18ed27df62f87"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.426429 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr" event={"ID":"ff6a5b70-c9ae-4087-b4fd-e24a712e6e33","Type":"ContainerStarted","Data":"ad645331891c7ea0f5839bd53705c5e690785e950ec9a98db6a44380ce31f1ab"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.428406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" event={"ID":"d60eef08-1564-405f-b4c1-3f391bbf741d","Type":"ContainerStarted","Data":"a759a12d11af9eba5839bfe764755259fbf0aa1c7b3a6051db64ba03d6da1560"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.439283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" event={"ID":"1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8","Type":"ContainerStarted","Data":"9ef9f20f08aea10ee934b5edc95b7c6ef65867dee2ca29e347d8f3096ce80556"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.440452 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g" event={"ID":"09f3d007-f621-4c30-a3f8-f3280a7db75d","Type":"ContainerStarted","Data":"2b4230969f252fd92344c2a7bbbe9f5e3e2ad24d05ba12e8c7aff969b2fc4e83"} Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.540099 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"] Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.550116 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5"] Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.552598 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd9df4_01da_4f5e_b66c_d1f7bd08574f.slice/crio-3a479fdbca4c05523b2727338105041870bec57b050abce63b221cf86f5e9779 WatchSource:0}: Error finding container 3a479fdbca4c05523b2727338105041870bec57b050abce63b221cf86f5e9779: Status 404 returned error can't find the container with id 3a479fdbca4c05523b2727338105041870bec57b050abce63b221cf86f5e9779 Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.554882 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd482c2_5871_4d59_8402_1e57b06055b0.slice/crio-1c326c46f5b9f0a02f5e50e992040ce6d08c8ca2c193c2e38f1b760713b61e46 WatchSource:0}: Error finding container 1c326c46f5b9f0a02f5e50e992040ce6d08c8ca2c193c2e38f1b760713b61e46: Status 404 returned error can't find the container with id 1c326c46f5b9f0a02f5e50e992040ce6d08c8ca2c193c2e38f1b760713b61e46 Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.555290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod 
\"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.555471 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.555542 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:20.555521374 +0000 UTC m=+935.123151776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.558277 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v2nl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-x9zq5_openstack-operators(6fd482c2-5871-4d59-8402-1e57b06055b0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.560102 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" podUID="6fd482c2-5871-4d59-8402-1e57b06055b0" Mar 18 18:18:18 crc 
kubenswrapper[4830]: I0318 18:18:18.572978 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54"] Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.583272 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode663b49d_cd0f_4a18_8284_b12dad6c136a.slice/crio-2416f0b8a8bb69b72b7c8ca3c91b31805d67e901a1d8172632ce911278e946f6 WatchSource:0}: Error finding container 2416f0b8a8bb69b72b7c8ca3c91b31805d67e901a1d8172632ce911278e946f6: Status 404 returned error can't find the container with id 2416f0b8a8bb69b72b7c8ca3c91b31805d67e901a1d8172632ce911278e946f6 Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.585674 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nwvbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-85j54_openstack-operators(e663b49d-cd0f-4a18-8284-b12dad6c136a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.587195 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" podUID="e663b49d-cd0f-4a18-8284-b12dad6c136a" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.637700 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq"] Mar 18 18:18:18 crc kubenswrapper[4830]: W0318 18:18:18.641652 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68f6008_a3fb_4039_85a0_c0475455ac09.slice/crio-8e2608c3ea41a31a2b7efb5b6b3ff61a4d6a45a830c09297273c9ae9c62b137b WatchSource:0}: Error finding container 
8e2608c3ea41a31a2b7efb5b6b3ff61a4d6a45a830c09297273c9ae9c62b137b: Status 404 returned error can't find the container with id 8e2608c3ea41a31a2b7efb5b6b3ff61a4d6a45a830c09297273c9ae9c62b137b Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.643946 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7t64c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-r9bbq_openstack-operators(f68f6008-a3fb-4039-85a0-c0475455ac09): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.645136 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" podUID="f68f6008-a3fb-4039-85a0-c0475455ac09" Mar 18 18:18:18 crc kubenswrapper[4830]: I0318 18:18:18.859529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.859856 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 18 18:18:18 crc kubenswrapper[4830]: E0318 18:18:18.859911 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. No retries permitted until 2026-03-18 18:18:20.859896871 +0000 UTC m=+935.427527193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.168669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.168709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.168858 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.168907 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff 
nodeName:}" failed. No retries permitted until 2026-03-18 18:18:21.168890848 +0000 UTC m=+935.736521180 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.169200 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.169233 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:21.169225467 +0000 UTC m=+935.736855799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.464097 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" event={"ID":"debd9df4-01da-4f5e-b66c-d1f7bd08574f","Type":"ContainerStarted","Data":"3a479fdbca4c05523b2727338105041870bec57b050abce63b221cf86f5e9779"} Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.471118 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" event={"ID":"f68f6008-a3fb-4039-85a0-c0475455ac09","Type":"ContainerStarted","Data":"8e2608c3ea41a31a2b7efb5b6b3ff61a4d6a45a830c09297273c9ae9c62b137b"} Mar 18 18:18:19 crc 
kubenswrapper[4830]: E0318 18:18:19.472561 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" podUID="f68f6008-a3fb-4039-85a0-c0475455ac09" Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.484925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" event={"ID":"6a1a3dc1-1535-4091-97a8-abc6dc2d1388","Type":"ContainerStarted","Data":"3e36bdd0d4318dd8b73742315fc93b8dd60bb7791a4b7f82251ee9d5f7714481"} Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.488083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" event={"ID":"6fd482c2-5871-4d59-8402-1e57b06055b0","Type":"ContainerStarted","Data":"1c326c46f5b9f0a02f5e50e992040ce6d08c8ca2c193c2e38f1b760713b61e46"} Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.491285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" event={"ID":"e7de1579-8ed8-4434-818e-a5ea0c366cf7","Type":"ContainerStarted","Data":"c0fc75473ecd59089ebc5a694f0fa92acb8a712d69992c5b5d8da4ea9813b765"} Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.494444 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" podUID="e7de1579-8ed8-4434-818e-a5ea0c366cf7" Mar 18 18:18:19 crc kubenswrapper[4830]: 
E0318 18:18:19.494970 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" podUID="6a1a3dc1-1535-4091-97a8-abc6dc2d1388" Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.498466 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" event={"ID":"28e6b5ce-47e2-43fb-b524-c2e642dbd166","Type":"ContainerStarted","Data":"211b67c1bd0eef5720077305ea032426dd84ad0731d56fe40aa3e037da2a176b"} Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.500025 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" podUID="6fd482c2-5871-4d59-8402-1e57b06055b0" Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.503641 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" podUID="28e6b5ce-47e2-43fb-b524-c2e642dbd166" Mar 18 18:18:19 crc kubenswrapper[4830]: I0318 18:18:19.505337 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" 
event={"ID":"e663b49d-cd0f-4a18-8284-b12dad6c136a","Type":"ContainerStarted","Data":"2416f0b8a8bb69b72b7c8ca3c91b31805d67e901a1d8172632ce911278e946f6"} Mar 18 18:18:19 crc kubenswrapper[4830]: E0318 18:18:19.508817 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" podUID="e663b49d-cd0f-4a18-8284-b12dad6c136a" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530031 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" podUID="e7de1579-8ed8-4434-818e-a5ea0c366cf7" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530076 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" podUID="28e6b5ce-47e2-43fb-b524-c2e642dbd166" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530170 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" podUID="f68f6008-a3fb-4039-85a0-c0475455ac09" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530211 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" podUID="e663b49d-cd0f-4a18-8284-b12dad6c136a" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530249 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" podUID="6a1a3dc1-1535-4091-97a8-abc6dc2d1388" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.530297 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" podUID="6fd482c2-5871-4d59-8402-1e57b06055b0" Mar 18 18:18:20 crc kubenswrapper[4830]: I0318 18:18:20.595534 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 18:18:20 crc 
kubenswrapper[4830]: E0318 18:18:20.595675 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.595718 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:24.595705236 +0000 UTC m=+939.163335568 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:20 crc kubenswrapper[4830]: I0318 18:18:20.910029 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.910267 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:20 crc kubenswrapper[4830]: E0318 18:18:20.910515 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. No retries permitted until 2026-03-18 18:18:24.910493164 +0000 UTC m=+939.478123496 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:21 crc kubenswrapper[4830]: I0318 18:18:21.218354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:21 crc kubenswrapper[4830]: I0318 18:18:21.218407 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:21 crc kubenswrapper[4830]: E0318 18:18:21.218559 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:18:21 crc kubenswrapper[4830]: E0318 18:18:21.218571 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:18:21 crc kubenswrapper[4830]: E0318 18:18:21.218608 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:25.218593825 +0000 UTC m=+939.786224157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found Mar 18 18:18:21 crc kubenswrapper[4830]: E0318 18:18:21.218657 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:25.218635106 +0000 UTC m=+939.786265438 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found Mar 18 18:18:24 crc kubenswrapper[4830]: I0318 18:18:24.691871 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 18:18:24 crc kubenswrapper[4830]: E0318 18:18:24.692037 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:24 crc kubenswrapper[4830]: E0318 18:18:24.692427 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:32.692404995 +0000 UTC m=+947.260035327 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:18:24 crc kubenswrapper[4830]: I0318 18:18:24.997303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:24 crc kubenswrapper[4830]: E0318 18:18:24.997530 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:24 crc kubenswrapper[4830]: E0318 18:18:24.997647 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. No retries permitted until 2026-03-18 18:18:32.997607315 +0000 UTC m=+947.565237677 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:18:25 crc kubenswrapper[4830]: I0318 18:18:25.307692 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:25 crc kubenswrapper[4830]: I0318 18:18:25.307825 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:25 crc kubenswrapper[4830]: E0318 18:18:25.309595 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:18:25 crc kubenswrapper[4830]: E0318 18:18:25.309681 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:33.309658177 +0000 UTC m=+947.877288539 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found Mar 18 18:18:25 crc kubenswrapper[4830]: E0318 18:18:25.310284 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:18:25 crc kubenswrapper[4830]: E0318 18:18:25.310347 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:33.310330606 +0000 UTC m=+947.877960968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.045673 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.046961 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ft5wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-dls7x_openstack-operators(7e1c4c11-ddb2-45a3-94eb-8b5b27866996): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.048817 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" podUID="7e1c4c11-ddb2-45a3-94eb-8b5b27866996" Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.648494 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" podUID="7e1c4c11-ddb2-45a3-94eb-8b5b27866996" Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.738860 4830 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.741545 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vlftd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-4mzzb_openstack-operators(1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.742999 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" podUID="1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8"
Mar 18 18:18:32 crc kubenswrapper[4830]: I0318 18:18:32.775934 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.776139 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 18:18:32 crc kubenswrapper[4830]: E0318 18:18:32.776255 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert podName:3d9eef66-a93f-432a-8391-f6a55dc3f800 nodeName:}" failed. No retries permitted until 2026-03-18 18:18:48.776225602 +0000 UTC m=+963.343855934 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert") pod "infra-operator-controller-manager-7b9c774f96-mg24p" (UID: "3d9eef66-a93f-432a-8391-f6a55dc3f800") : secret "infra-operator-webhook-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.079723 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.080444 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.080498 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert podName:7a872110-8984-419e-b5ed-177ec5669cfc nodeName:}" failed. No retries permitted until 2026-03-18 18:18:49.080482696 +0000 UTC m=+963.648113028 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899qtfsv" (UID: "7a872110-8984-419e-b5ed-177ec5669cfc") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.385022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.385088 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.385198 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.385230 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.385283 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:49.385260844 +0000 UTC m=+963.952891176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "metrics-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.385298 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs podName:8c1cd3c4-f399-4810-bdaf-53644d7555ff nodeName:}" failed. No retries permitted until 2026-03-18 18:18:49.385292285 +0000 UTC m=+963.952922617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-x2hs7" (UID: "8c1cd3c4-f399-4810-bdaf-53644d7555ff") : secret "webhook-server-cert" not found
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.662486 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766" event={"ID":"cebd0fbd-7733-464a-aead-539d69b70b04","Type":"ContainerStarted","Data":"a0ffb73389f9a71e89b91779d877ff6b87b6345b8f5b6c0b6d97a1d4882c8f72"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.663466 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.666264 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv" event={"ID":"abc98dd9-9c79-45a2-a641-023633c1b75b","Type":"ContainerStarted","Data":"d29a27f831c70c6682a7a5b7959e31d9138f24ff58ff36b66947810173c3c754"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.666367 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.676884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g" event={"ID":"09f3d007-f621-4c30-a3f8-f3280a7db75d","Type":"ContainerStarted","Data":"3fc10896747de86c95452b010cb95d4f7a70ee861c84d692a7eb96b46e86dc3a"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.677025 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.692307 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94" event={"ID":"45854217-6284-4678-903f-d64b4088ec29","Type":"ContainerStarted","Data":"06442dd34012919af0249ff1f305c70770a115ef10bddec49a11a1f85f8304c1"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.692830 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.707375 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp" event={"ID":"bd94092b-4a34-4f83-9aaa-5ddac374e97a","Type":"ContainerStarted","Data":"244d0bd48ba6b9907db08f11588a815f973d83ca543b1cfc98a74e090c5accb6"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.707526 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.723344 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j" event={"ID":"b1bf7404-a81d-42e8-bc1b-157c5cd791b0","Type":"ContainerStarted","Data":"2f734ff55742d2c703536d8a2af12cf0a21ae02606d59120f94a472bdf9260dd"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.723510 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.738686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr" event={"ID":"ff6a5b70-c9ae-4087-b4fd-e24a712e6e33","Type":"ContainerStarted","Data":"318f7a76555d275f51b7ca953848a62c6fbc3b55bdc176b601909a989d6b58b1"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.738822 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.745437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" event={"ID":"db7f527f-6421-4a26-9eae-5a68054b2a88","Type":"ContainerStarted","Data":"72a211fecaff31c915719ab0cfb9cb1271b8c2d6ee0bd20c36bf32a2431b9e0d"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.745533 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.749430 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94" podStartSLOduration=3.006374495 podStartE2EDuration="17.749412642s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.950757521 +0000 UTC m=+932.518387843" lastFinishedPulling="2026-03-18 18:18:32.693795658 +0000 UTC m=+947.261425990" observedRunningTime="2026-03-18 18:18:33.748078275 +0000 UTC m=+948.315708597" watchObservedRunningTime="2026-03-18 18:18:33.749412642 +0000 UTC m=+948.317042974"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.752301 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766" podStartSLOduration=2.782528018 podStartE2EDuration="17.752294142s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.733122518 +0000 UTC m=+932.300752850" lastFinishedPulling="2026-03-18 18:18:32.702888622 +0000 UTC m=+947.270518974" observedRunningTime="2026-03-18 18:18:33.707267794 +0000 UTC m=+948.274898126" watchObservedRunningTime="2026-03-18 18:18:33.752294142 +0000 UTC m=+948.319924474"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.765460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" event={"ID":"d60eef08-1564-405f-b4c1-3f391bbf741d","Type":"ContainerStarted","Data":"0e726d4b59032dd9a9bc2b00e6727d35636ecff61fcfa5f20c35cc7526d81581"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.766086 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.786945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" event={"ID":"debd9df4-01da-4f5e-b66c-d1f7bd08574f","Type":"ContainerStarted","Data":"d82be99ea0ab3972c38d985f733f541c0cd07df2a327eca114acd0fd4db463dd"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.787607 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.797195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6" event={"ID":"2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8","Type":"ContainerStarted","Data":"a42d3fa743152c42b4e589afbff2d55d2968c8d7c98722200f3b95ef9caf2855"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.797826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.799816 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" event={"ID":"ed9931be-8631-4e73-92bd-ff18076dcb69","Type":"ContainerStarted","Data":"e569414543ad2332f0a358b9238b76fcdda77868c2040de4de330799d455dfe2"}
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.799839 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"
Mar 18 18:18:33 crc kubenswrapper[4830]: E0318 18:18:33.800953 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" podUID="1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.876895 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g" podStartSLOduration=2.841690292 podStartE2EDuration="17.876876944s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.660069907 +0000 UTC m=+932.227700239" lastFinishedPulling="2026-03-18 18:18:32.695256559 +0000 UTC m=+947.262886891" observedRunningTime="2026-03-18 18:18:33.809065329 +0000 UTC m=+948.376695661" watchObservedRunningTime="2026-03-18 18:18:33.876876944 +0000 UTC m=+948.444507276"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.877541 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv" podStartSLOduration=3.454647084 podStartE2EDuration="17.877533813s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.273619925 +0000 UTC m=+932.841250257" lastFinishedPulling="2026-03-18 18:18:32.696506654 +0000 UTC m=+947.264136986" observedRunningTime="2026-03-18 18:18:33.871188335 +0000 UTC m=+948.438818667" watchObservedRunningTime="2026-03-18 18:18:33.877533813 +0000 UTC m=+948.445164145"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.894637 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp" podStartSLOduration=3.131567875 podStartE2EDuration="17.89461294s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.932883532 +0000 UTC m=+932.500513864" lastFinishedPulling="2026-03-18 18:18:32.695928597 +0000 UTC m=+947.263558929" observedRunningTime="2026-03-18 18:18:33.889786305 +0000 UTC m=+948.457416637" watchObservedRunningTime="2026-03-18 18:18:33.89461294 +0000 UTC m=+948.462243272"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.924378 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c" podStartSLOduration=3.786861849 podStartE2EDuration="17.924356601s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.557804038 +0000 UTC m=+933.125434370" lastFinishedPulling="2026-03-18 18:18:32.69529878 +0000 UTC m=+947.262929122" observedRunningTime="2026-03-18 18:18:33.920887284 +0000 UTC m=+948.488517616" watchObservedRunningTime="2026-03-18 18:18:33.924356601 +0000 UTC m=+948.491986933"
Mar 18 18:18:33 crc kubenswrapper[4830]: I0318 18:18:33.979921 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6" podStartSLOduration=3.663222154 podStartE2EDuration="17.979904254s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.38042902 +0000 UTC m=+932.948059352" lastFinishedPulling="2026-03-18 18:18:32.69711112 +0000 UTC m=+947.264741452" observedRunningTime="2026-03-18 18:18:33.976389806 +0000 UTC m=+948.544020138" watchObservedRunningTime="2026-03-18 18:18:33.979904254 +0000 UTC m=+948.547534586"
Mar 18 18:18:34 crc kubenswrapper[4830]: I0318 18:18:34.014437 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr" podStartSLOduration=3.21440704 podStartE2EDuration="18.014422919s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.897084381 +0000 UTC m=+932.464714713" lastFinishedPulling="2026-03-18 18:18:32.69710026 +0000 UTC m=+947.264730592" observedRunningTime="2026-03-18 18:18:34.012147975 +0000 UTC m=+948.579778307" watchObservedRunningTime="2026-03-18 18:18:34.014422919 +0000 UTC m=+948.582053251"
Mar 18 18:18:34 crc kubenswrapper[4830]: I0318 18:18:34.052305 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm" podStartSLOduration=3.606263092 podStartE2EDuration="18.052287447s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.250040636 +0000 UTC m=+932.817670968" lastFinishedPulling="2026-03-18 18:18:32.696064981 +0000 UTC m=+947.263695323" observedRunningTime="2026-03-18 18:18:34.047858093 +0000 UTC m=+948.615488425" watchObservedRunningTime="2026-03-18 18:18:34.052287447 +0000 UTC m=+948.619917779"
Mar 18 18:18:34 crc kubenswrapper[4830]: I0318 18:18:34.088935 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc" podStartSLOduration=3.647338819 podStartE2EDuration="18.08889827s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.323467648 +0000 UTC m=+932.891097980" lastFinishedPulling="2026-03-18 18:18:32.765027089 +0000 UTC m=+947.332657431" observedRunningTime="2026-03-18 18:18:34.082874682 +0000 UTC m=+948.650505014" watchObservedRunningTime="2026-03-18 18:18:34.08889827 +0000 UTC m=+948.656528602"
Mar 18 18:18:34 crc kubenswrapper[4830]: I0318 18:18:34.134826 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr" podStartSLOduration=3.81518922 podStartE2EDuration="18.134807573s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.378354402 +0000 UTC m=+932.945984734" lastFinishedPulling="2026-03-18 18:18:32.697972745 +0000 UTC m=+947.265603087" observedRunningTime="2026-03-18 18:18:34.131257214 +0000 UTC m=+948.698887546" watchObservedRunningTime="2026-03-18 18:18:34.134807573 +0000 UTC m=+948.702437905"
Mar 18 18:18:34 crc kubenswrapper[4830]: I0318 18:18:34.136358 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j" podStartSLOduration=3.340188306 podStartE2EDuration="18.136352667s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:17.897576975 +0000 UTC m=+932.465207307" lastFinishedPulling="2026-03-18 18:18:32.693741326 +0000 UTC m=+947.261371668" observedRunningTime="2026-03-18 18:18:34.109042433 +0000 UTC m=+948.676672765" watchObservedRunningTime="2026-03-18 18:18:34.136352667 +0000 UTC m=+948.703982999"
Mar 18 18:18:37 crc kubenswrapper[4830]: I0318 18:18:37.338620 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-cpvkm"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.895156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" event={"ID":"f68f6008-a3fb-4039-85a0-c0475455ac09","Type":"ContainerStarted","Data":"0688b1116b7e64a4bf9514ebccfb287a1de25feceb133f99ce275901277fa44f"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.896915 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.899447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" event={"ID":"6a1a3dc1-1535-4091-97a8-abc6dc2d1388","Type":"ContainerStarted","Data":"aeb6f0b8d8cd02b491d54fbe16f7b9983558abadb7abc20fce44319ed82a2e9b"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.899668 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.902205 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" event={"ID":"6fd482c2-5871-4d59-8402-1e57b06055b0","Type":"ContainerStarted","Data":"3468d1ea275f6ff74ef7463fead743b2ec894ceca079f9375de50a772e5e6531"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.902379 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.903993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" event={"ID":"e7de1579-8ed8-4434-818e-a5ea0c366cf7","Type":"ContainerStarted","Data":"8baec63a312e62bfb3b47ab812816ab7fcc3d6c49be06f09376d0746e83345bb"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.904233 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.906607 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" event={"ID":"28e6b5ce-47e2-43fb-b524-c2e642dbd166","Type":"ContainerStarted","Data":"a412615952b560c237756827d18838f5933266c8141c56e13bfc72fc358f9d94"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.907189 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.909232 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" event={"ID":"e663b49d-cd0f-4a18-8284-b12dad6c136a","Type":"ContainerStarted","Data":"a414924d86268f650c57b8f5de96110e74dc2aaf2ca39493d72db260a44dd983"}
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.923688 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" podStartSLOduration=2.09241798 podStartE2EDuration="28.923666952s" podCreationTimestamp="2026-03-18 18:18:17 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.643807742 +0000 UTC m=+933.211438074" lastFinishedPulling="2026-03-18 18:18:45.475056724 +0000 UTC m=+960.042687046" observedRunningTime="2026-03-18 18:18:45.915616727 +0000 UTC m=+960.483247109" watchObservedRunningTime="2026-03-18 18:18:45.923666952 +0000 UTC m=+960.491297294"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.932756 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" podStartSLOduration=2.863664715 podStartE2EDuration="29.932734225s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.40654901 +0000 UTC m=+932.974179342" lastFinishedPulling="2026-03-18 18:18:45.47561852 +0000 UTC m=+960.043248852" observedRunningTime="2026-03-18 18:18:45.93182628 +0000 UTC m=+960.499456612" watchObservedRunningTime="2026-03-18 18:18:45.932734225 +0000 UTC m=+960.500364577"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.951147 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" podStartSLOduration=2.83449749 podStartE2EDuration="29.951126739s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.415062388 +0000 UTC m=+932.982692720" lastFinishedPulling="2026-03-18 18:18:45.531691637 +0000 UTC m=+960.099321969" observedRunningTime="2026-03-18 18:18:45.944430802 +0000 UTC m=+960.512061134" watchObservedRunningTime="2026-03-18 18:18:45.951126739 +0000 UTC m=+960.518757071"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.962235 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" podStartSLOduration=2.840579 podStartE2EDuration="29.962217289s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.410019547 +0000 UTC m=+932.977649879" lastFinishedPulling="2026-03-18 18:18:45.531657836 +0000 UTC m=+960.099288168" observedRunningTime="2026-03-18 18:18:45.95866189 +0000 UTC m=+960.526292222" watchObservedRunningTime="2026-03-18 18:18:45.962217289 +0000 UTC m=+960.529847621"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.976743 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85j54" podStartSLOduration=2.028612757 podStartE2EDuration="28.976716785s" podCreationTimestamp="2026-03-18 18:18:17 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.585561204 +0000 UTC m=+933.153191536" lastFinishedPulling="2026-03-18 18:18:45.533665232 +0000 UTC m=+960.101295564" observedRunningTime="2026-03-18 18:18:45.970751728 +0000 UTC m=+960.538382070" watchObservedRunningTime="2026-03-18 18:18:45.976716785 +0000 UTC m=+960.544347137"
Mar 18 18:18:45 crc kubenswrapper[4830]: I0318 18:18:45.991811 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" podStartSLOduration=2.063690936 podStartE2EDuration="28.991795726s" podCreationTimestamp="2026-03-18 18:18:17 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.558168768 +0000 UTC m=+933.125799100" lastFinishedPulling="2026-03-18 18:18:45.486273558 +0000 UTC m=+960.053903890" observedRunningTime="2026-03-18 18:18:45.985878001 +0000 UTC m=+960.553508333" watchObservedRunningTime="2026-03-18 18:18:45.991795726 +0000 UTC m=+960.559426058"
Mar 18 18:18:46 crc kubenswrapper[4830]: I0318 18:18:46.943276 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-n478g"
Mar 18 18:18:46 crc kubenswrapper[4830]: I0318 18:18:46.959067 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4tflr"
Mar 18 18:18:46 crc kubenswrapper[4830]: I0318 18:18:46.981202 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bw766"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.036741 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cspdp"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.042962 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-84z2j"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.112792 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b2f94"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.230317 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-nvkbv"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.364369 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6jmp6"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.591295 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-ct2qr"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.686758 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-thwsc"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.753539 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-pfj9c"
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.931597 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" event={"ID":"1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8","Type":"ContainerStarted","Data":"50868a1733f102bf9d88396f1beef7a4de612376bc1d540a3fea47ef1f3b6acb"}
Mar 18 18:18:47 crc kubenswrapper[4830]: I0318 18:18:47.932843 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.874845 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.885882 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9eef66-a93f-432a-8391-f6a55dc3f800-cert\") pod \"infra-operator-controller-manager-7b9c774f96-mg24p\" (UID: \"3d9eef66-a93f-432a-8391-f6a55dc3f800\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.943403 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" event={"ID":"7e1c4c11-ddb2-45a3-94eb-8b5b27866996","Type":"ContainerStarted","Data":"75aa5f367117e068a0d688c004a0accf5bb2e28232394746e9e9d1e162a7c4cf"}
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.944220 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.972648 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" podStartSLOduration=4.486520055 podStartE2EDuration="32.972593928s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.310435924 +0000 UTC m=+932.878066256" lastFinishedPulling="2026-03-18 18:18:46.796509787 +0000 UTC m=+961.364140129" observedRunningTime="2026-03-18 18:18:47.951605672 +0000 UTC m=+962.519236014" watchObservedRunningTime="2026-03-18 18:18:48.972593928 +0000 UTC m=+963.540224300"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.976730 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tx4km"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.977105 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" podStartSLOduration=3.474243462 podStartE2EDuration="32.977083764s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:18.250792117 +0000 UTC m=+932.818422449" lastFinishedPulling="2026-03-18 18:18:47.753632419 +0000 UTC m=+962.321262751" observedRunningTime="2026-03-18 18:18:48.965447638 +0000 UTC m=+963.533078000" watchObservedRunningTime="2026-03-18 18:18:48.977083764 +0000 UTC m=+963.544714146"
Mar 18 18:18:48 crc kubenswrapper[4830]: I0318 18:18:48.987969 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.098686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.105251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a872110-8984-419e-b5ed-177ec5669cfc-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899qtfsv\" (UID: \"7a872110-8984-419e-b5ed-177ec5669cfc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.223509 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8h78k"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.233152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.338492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p"]
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.405422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.405466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.411522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.415247 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c1cd3c4-f399-4810-bdaf-53644d7555ff-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-x2hs7\" (UID: \"8c1cd3c4-f399-4810-bdaf-53644d7555ff\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.497885 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv"]
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.622108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prts6"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.630466 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.910373 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7"]
Mar 18 18:18:49 crc kubenswrapper[4830]: W0318 18:18:49.923192 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1cd3c4_f399_4810_bdaf_53644d7555ff.slice/crio-1e24c1f1a4341d434b4d118e5613598ca90cf380ddb1daf562f9bcab2cbce0d9 WatchSource:0}: Error finding container 1e24c1f1a4341d434b4d118e5613598ca90cf380ddb1daf562f9bcab2cbce0d9: Status 404 returned error can't find the container with id 1e24c1f1a4341d434b4d118e5613598ca90cf380ddb1daf562f9bcab2cbce0d9
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.951554 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" event={"ID":"7a872110-8984-419e-b5ed-177ec5669cfc","Type":"ContainerStarted","Data":"a5af22e9dc2ba6304464739628eb7e86ee01723e0810826ac16cf13c77cbab76"}
Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.953266 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" event={"ID":"3d9eef66-a93f-432a-8391-f6a55dc3f800","Type":"ContainerStarted","Data":"ce51809bf4ba0e93b20d8932dbf4740fefc39f22fee5e4a745dbaa7964954b38"} Mar 18 18:18:49 crc kubenswrapper[4830]: I0318 18:18:49.954429 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" event={"ID":"8c1cd3c4-f399-4810-bdaf-53644d7555ff","Type":"ContainerStarted","Data":"1e24c1f1a4341d434b4d118e5613598ca90cf380ddb1daf562f9bcab2cbce0d9"} Mar 18 18:18:50 crc kubenswrapper[4830]: I0318 18:18:50.973002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" event={"ID":"8c1cd3c4-f399-4810-bdaf-53644d7555ff","Type":"ContainerStarted","Data":"cbdf6cf95c1947df893fb828e04d6ed7da98c1a7c43ef3bd72742ffddf762719"} Mar 18 18:18:50 crc kubenswrapper[4830]: I0318 18:18:50.973648 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:18:51 crc kubenswrapper[4830]: I0318 18:18:51.005678 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" podStartSLOduration=34.00565944 podStartE2EDuration="34.00565944s" podCreationTimestamp="2026-03-18 18:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:18:51.00422443 +0000 UTC m=+965.571854762" watchObservedRunningTime="2026-03-18 18:18:51.00565944 +0000 UTC m=+965.573289782" Mar 18 18:18:52 crc kubenswrapper[4830]: I0318 18:18:52.994725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" 
event={"ID":"3d9eef66-a93f-432a-8391-f6a55dc3f800","Type":"ContainerStarted","Data":"38176674a20c9179c24349bd29e2431a16698bb864bd78446495d2b4ef48b8be"} Mar 18 18:18:52 crc kubenswrapper[4830]: I0318 18:18:52.996859 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 18:18:52 crc kubenswrapper[4830]: I0318 18:18:52.998262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" event={"ID":"7a872110-8984-419e-b5ed-177ec5669cfc","Type":"ContainerStarted","Data":"e0bc71ee914cbd1d23c3f98d9a7d17d555a118cb34f67befc57de1218aa3eae4"} Mar 18 18:18:52 crc kubenswrapper[4830]: I0318 18:18:52.998510 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:53 crc kubenswrapper[4830]: I0318 18:18:53.037967 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" podStartSLOduration=34.267262732 podStartE2EDuration="37.03793329s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:49.350611033 +0000 UTC m=+963.918241355" lastFinishedPulling="2026-03-18 18:18:52.121281581 +0000 UTC m=+966.688911913" observedRunningTime="2026-03-18 18:18:53.026244524 +0000 UTC m=+967.593874886" watchObservedRunningTime="2026-03-18 18:18:53.03793329 +0000 UTC m=+967.605563663" Mar 18 18:18:53 crc kubenswrapper[4830]: I0318 18:18:53.071144 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" podStartSLOduration=34.435067772 podStartE2EDuration="37.071124888s" podCreationTimestamp="2026-03-18 18:18:16 +0000 UTC" firstStartedPulling="2026-03-18 18:18:49.508382452 
+0000 UTC m=+964.076012794" lastFinishedPulling="2026-03-18 18:18:52.144439578 +0000 UTC m=+966.712069910" observedRunningTime="2026-03-18 18:18:53.067837326 +0000 UTC m=+967.635467668" watchObservedRunningTime="2026-03-18 18:18:53.071124888 +0000 UTC m=+967.638755230" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.332060 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dls7x" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.359705 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vjzn6" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.392845 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-4mzzb" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.425234 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6282s" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.453996 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d76n9" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.768405 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-x9zq5" Mar 18 18:18:57 crc kubenswrapper[4830]: I0318 18:18:57.806817 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-r9bbq" Mar 18 18:18:58 crc kubenswrapper[4830]: I0318 18:18:58.997691 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-mg24p" Mar 18 
18:18:59 crc kubenswrapper[4830]: I0318 18:18:59.241813 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899qtfsv" Mar 18 18:18:59 crc kubenswrapper[4830]: I0318 18:18:59.509511 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:18:59 crc kubenswrapper[4830]: I0318 18:18:59.509592 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:18:59 crc kubenswrapper[4830]: I0318 18:18:59.640415 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-x2hs7" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.616418 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.618262 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.625021 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.625419 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.625648 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jgqvq" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.625832 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.639165 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.659711 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.660898 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.664080 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.685458 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.708443 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ppt\" (UniqueName: \"kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.708551 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.811675 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlrk\" (UniqueName: \"kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.811833 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " 
pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.811928 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ppt\" (UniqueName: \"kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.811985 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.812025 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.812936 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.833284 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57ppt\" (UniqueName: \"kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt\") pod \"dnsmasq-dns-5448ff6dc7-brw2k\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc 
kubenswrapper[4830]: I0318 18:19:15.914583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.914981 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlrk\" (UniqueName: \"kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.915067 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.915558 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.915971 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.931696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mvlrk\" (UniqueName: \"kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk\") pod \"dnsmasq-dns-64696987c5-5v6nj\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.949624 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:15 crc kubenswrapper[4830]: I0318 18:19:15.979153 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:16 crc kubenswrapper[4830]: I0318 18:19:16.414047 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:16 crc kubenswrapper[4830]: I0318 18:19:16.488071 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:16 crc kubenswrapper[4830]: W0318 18:19:16.494629 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc01f528c_b72a_4d75_8480_484e9d5ad79a.slice/crio-f899d169c285580ec65fc1d9ef8b53a7d1ffbd9f4671d4ee7a85291cf58fcda6 WatchSource:0}: Error finding container f899d169c285580ec65fc1d9ef8b53a7d1ffbd9f4671d4ee7a85291cf58fcda6: Status 404 returned error can't find the container with id f899d169c285580ec65fc1d9ef8b53a7d1ffbd9f4671d4ee7a85291cf58fcda6 Mar 18 18:19:17 crc kubenswrapper[4830]: I0318 18:19:17.255412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" event={"ID":"2e41d0ce-3d03-4246-a830-b03518c21f89","Type":"ContainerStarted","Data":"1acb21ca7263839ff3e9e5035baaf42d68ae7413294ffdbb7a682aff51d5652a"} Mar 18 18:19:17 crc kubenswrapper[4830]: I0318 18:19:17.257270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" 
event={"ID":"c01f528c-b72a-4d75-8480-484e9d5ad79a","Type":"ContainerStarted","Data":"f899d169c285580ec65fc1d9ef8b53a7d1ffbd9f4671d4ee7a85291cf58fcda6"} Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.666277 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.704002 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.705458 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.715292 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.881282 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9k9w\" (UniqueName: \"kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.881343 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.881382 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " 
pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.984882 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.985867 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9k9w\" (UniqueName: \"kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.985930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.985974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.986931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:18 crc kubenswrapper[4830]: I0318 18:19:18.987908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" 
(UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.015506 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"] Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.016510 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.041739 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"] Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.058124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9k9w\" (UniqueName: \"kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w\") pod \"dnsmasq-dns-658f55c9f5-vvwtp\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") " pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.189087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswc7\" (UniqueName: \"kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.189136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.189346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.291255 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.291466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.291608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zswc7\" (UniqueName: \"kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.292711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.293157 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc\") pod 
\"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.317460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswc7\" (UniqueName: \"kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7\") pod \"dnsmasq-dns-54b5dffb47-jxtww\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") " pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.327148 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.372522 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.651415 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.844966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.846991 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.856703 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.856858 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.857051 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.857101 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.857262 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.857763 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.857056 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-67xdw" Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.864300 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:19:19 crc kubenswrapper[4830]: I0318 18:19:19.911246 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"] Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005213 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005253 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005295 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005331 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005351 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 
18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005395 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005414 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005440 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005458 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbgzn\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.005482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc 
kubenswrapper[4830]: I0318 18:19:20.106643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbgzn\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107213 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107263 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107307 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107367 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.107445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.108973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.109313 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.114433 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.114890 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.115895 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.115040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.125056 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.125876 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.127016 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.141134 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.161627 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbgzn\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.150759 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.172206 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.173055 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.182322 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.182532 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.182700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.182873 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.183048 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.183338 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-server-dockercfg-g4ldz" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.189037 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.201087 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.248584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282479 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282533 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282755 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9jv\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282872 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282916 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282947 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.282997 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.283029 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf\") pod \"rabbitmq-server-0\" 
(UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.283054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.283218 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.313024 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" event={"ID":"2046a38e-0101-47a2-88d5-f91ca521cb9a","Type":"ContainerStarted","Data":"e3cd4b0e68a6a086a43f45d40803a1844b42f413c54a139e981cd9ccadb0bf78"} Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.316203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" event={"ID":"24d85c15-a28a-40ef-92cb-611d03123bc8","Type":"ContainerStarted","Data":"511b611d0df49bd8e363b7b1c46b22bb1df50b4a0e1d6d37c526bb4e07f50710"} Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384559 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384616 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9jv\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384716 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384744 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384760 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384790 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.384848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.385982 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.389827 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.390403 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.390985 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.391233 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.396711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.397352 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.414750 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.415744 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.416529 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9jv\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.419411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.460893 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.611335 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.684939 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:19:20 crc kubenswrapper[4830]: W0318 18:19:20.701276 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda639262d_5bc7_4b14_a6ef_59583fdffb07.slice/crio-4cfa4a0484891280bed69c79151db54e20b954f679f73ed24fb22fdc6733635d WatchSource:0}: Error finding container 4cfa4a0484891280bed69c79151db54e20b954f679f73ed24fb22fdc6733635d: Status 404 returned error can't find the container with id 4cfa4a0484891280bed69c79151db54e20b954f679f73ed24fb22fdc6733635d Mar 18 18:19:20 crc kubenswrapper[4830]: I0318 18:19:20.935341 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.136003 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.137650 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.142723 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.147149 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pwfv8" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.147165 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.148122 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.154811 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.157900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.302543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.302978 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303005 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303056 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303089 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303106 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.303157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9cp\" (UniqueName: 
\"kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.337277 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerStarted","Data":"4cfa4a0484891280bed69c79151db54e20b954f679f73ed24fb22fdc6733635d"} Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.339876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerStarted","Data":"81bc88812550989ad89124dc8826a79b66e9b1a3b524cf17759ea64751502fc0"} Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.405872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.405936 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.405985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406029 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406045 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406120 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9cp\" (UniqueName: \"kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406145 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406190 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.406449 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.407874 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.408187 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.408640 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.408958 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.429703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.429810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.471224 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9cp\" (UniqueName: \"kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.484327 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " pod="openstack/openstack-galera-0" Mar 18 18:19:21 crc kubenswrapper[4830]: I0318 18:19:21.767383 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.336388 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.338488 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.342190 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.350108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c6d24" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.350210 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.351948 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.354246 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.420866 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.420927 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.420959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.421370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.421403 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.421438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.421463 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.421498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft9kh\" (UniqueName: 
\"kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9kh\" (UniqueName: \"kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523408 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523516 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.523947 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.524752 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") device mount path 
\"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.527757 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.528256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.535855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.536513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.551546 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9kh\" (UniqueName: \"kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: 
I0318 18:19:22.582650 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.639832 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.677993 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.799706 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.800665 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.809426 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.809430 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.818547 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-455tk" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.819712 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.945041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.945279 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.945632 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.945806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7jxbg\" (UniqueName: \"kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:22 crc kubenswrapper[4830]: I0318 18:19:22.945882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.047554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.047645 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.047681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.047705 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxbg\" (UniqueName: \"kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg\") pod \"memcached-0\" (UID: 
\"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.047728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.048661 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.049832 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.064091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.076380 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.110489 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxbg\" (UniqueName: 
\"kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg\") pod \"memcached-0\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " pod="openstack/memcached-0" Mar 18 18:19:23 crc kubenswrapper[4830]: I0318 18:19:23.121108 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 18:19:24 crc kubenswrapper[4830]: I0318 18:19:24.920994 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:19:24 crc kubenswrapper[4830]: I0318 18:19:24.922310 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:19:24 crc kubenswrapper[4830]: I0318 18:19:24.930314 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lf788" Mar 18 18:19:24 crc kubenswrapper[4830]: I0318 18:19:24.941576 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:19:24 crc kubenswrapper[4830]: I0318 18:19:24.993810 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ndww\" (UniqueName: \"kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww\") pod \"kube-state-metrics-0\" (UID: \"eceefe95-ad07-4228-ac93-a8f2484ba584\") " pod="openstack/kube-state-metrics-0" Mar 18 18:19:25 crc kubenswrapper[4830]: I0318 18:19:25.097711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ndww\" (UniqueName: \"kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww\") pod \"kube-state-metrics-0\" (UID: \"eceefe95-ad07-4228-ac93-a8f2484ba584\") " pod="openstack/kube-state-metrics-0" Mar 18 18:19:25 crc kubenswrapper[4830]: I0318 18:19:25.121647 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ndww\" (UniqueName: 
\"kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww\") pod \"kube-state-metrics-0\" (UID: \"eceefe95-ad07-4228-ac93-a8f2484ba584\") " pod="openstack/kube-state-metrics-0" Mar 18 18:19:25 crc kubenswrapper[4830]: I0318 18:19:25.272417 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.146672 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-chwf9"] Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.150300 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.161627 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jkwns" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.161955 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.162163 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.168579 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9"] Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265250 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265302 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265371 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265390 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98gw\" (UniqueName: \"kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.265418 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.367606 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.367685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.367714 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98gw\" (UniqueName: \"kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.367762 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.367982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts\") pod \"ovn-controller-chwf9\" (UID: 
\"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.368008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.368032 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.368649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.368675 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.368913 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.375478 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.378578 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.389037 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"] Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.394604 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98gw\" (UniqueName: \"kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.399865 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs\") pod \"ovn-controller-chwf9\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.399882 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.407141 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"] Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469160 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469219 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469253 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469282 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.469341 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44jd\" (UniqueName: \"kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.488884 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571504 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44jd\" (UniqueName: \"kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571634 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571755 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.571933 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.572002 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log\") pod 
\"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.572102 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.573702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.590343 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44jd\" (UniqueName: \"kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd\") pod \"ovn-controller-ovs-dv8kn\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:28 crc kubenswrapper[4830]: I0318 18:19:28.773583 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:29 crc kubenswrapper[4830]: I0318 18:19:29.510328 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:19:29 crc kubenswrapper[4830]: I0318 18:19:29.510936 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.401362 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.403797 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.406922 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.407669 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pfdlm" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.407801 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.408190 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.409381 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.424011 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kr6m\" (UniqueName: \"kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522472 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522526 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522797 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.522826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.623884 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kr6m\" (UniqueName: \"kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.623947 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.623974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624017 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624469 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Mar 18 
18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624440 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624721 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.624872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.625673 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.628543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.650573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.652059 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.654113 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.659369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kr6m\" (UniqueName: \"kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " 
pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.660172 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:30 crc kubenswrapper[4830]: I0318 18:19:30.728543 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.351330 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.353637 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.358209 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zvh6n" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.358378 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.358437 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.358482 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.358858 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438517 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438747 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438838 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438881 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 
crc kubenswrapper[4830]: I0318 18:19:31.438901 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.438951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4v24\" (UniqueName: \"kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.541058 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.541898 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.541933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.541953 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.542024 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.542051 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.542066 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.542093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4v24\" (UniqueName: \"kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.542661 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.544376 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.544632 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.544740 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.548642 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.549324 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.552805 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.562402 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4v24\" (UniqueName: \"kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.584141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:31 crc kubenswrapper[4830]: I0318 18:19:31.722008 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:37 crc kubenswrapper[4830]: E0318 18:19:37.939211 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d" Mar 18 18:19:37 crc kubenswrapper[4830]: E0318 18:19:37.939991 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbgzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(a639262d-5bc7-4b14-a6ef-59583fdffb07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:19:37 crc 
kubenswrapper[4830]: E0318 18:19:37.941228 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" Mar 18 18:19:38 crc kubenswrapper[4830]: E0318 18:19:38.570181 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.066975 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.068224 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvlrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-5v6nj_openstack(c01f528c-b72a-4d75-8480-484e9d5ad79a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.069558 4830 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" podUID="c01f528c-b72a-4d75-8480-484e9d5ad79a" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.079311 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.079701 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9k9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-vvwtp_openstack(24d85c15-a28a-40ef-92cb-611d03123bc8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.081020 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.118159 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.118367 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zswc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-jxtww_openstack(2046a38e-0101-47a2-88d5-f91ca521cb9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.120721 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.328101 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.328506 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57ppt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-brw2k_openstack(2e41d0ce-3d03-4246-a830-b03518c21f89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.329909 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" podUID="2e41d0ce-3d03-4246-a830-b03518c21f89" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.591812 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" Mar 18 18:19:41 crc kubenswrapper[4830]: E0318 18:19:41.591815 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.715016 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:19:41 crc kubenswrapper[4830]: W0318 18:19:41.725663 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeceefe95_ad07_4228_ac93_a8f2484ba584.slice/crio-9051b9364ed790c8b9d8c8949d64fcbefd485723e7a773ed646932bcd1c0e6d0 WatchSource:0}: Error finding container 9051b9364ed790c8b9d8c8949d64fcbefd485723e7a773ed646932bcd1c0e6d0: Status 404 returned error can't find the container with id 9051b9364ed790c8b9d8c8949d64fcbefd485723e7a773ed646932bcd1c0e6d0 Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.879804 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.888422 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.974858 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc\") pod \"c01f528c-b72a-4d75-8480-484e9d5ad79a\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.975128 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config\") pod \"c01f528c-b72a-4d75-8480-484e9d5ad79a\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.975307 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config\") pod \"2e41d0ce-3d03-4246-a830-b03518c21f89\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.975460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvlrk\" (UniqueName: \"kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk\") pod \"c01f528c-b72a-4d75-8480-484e9d5ad79a\" (UID: \"c01f528c-b72a-4d75-8480-484e9d5ad79a\") " Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.975589 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57ppt\" (UniqueName: \"kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt\") pod \"2e41d0ce-3d03-4246-a830-b03518c21f89\" (UID: \"2e41d0ce-3d03-4246-a830-b03518c21f89\") " Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.975528 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c01f528c-b72a-4d75-8480-484e9d5ad79a" (UID: "c01f528c-b72a-4d75-8480-484e9d5ad79a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.976198 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config" (OuterVolumeSpecName: "config") pod "c01f528c-b72a-4d75-8480-484e9d5ad79a" (UID: "c01f528c-b72a-4d75-8480-484e9d5ad79a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.976312 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config" (OuterVolumeSpecName: "config") pod "2e41d0ce-3d03-4246-a830-b03518c21f89" (UID: "2e41d0ce-3d03-4246-a830-b03518c21f89"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.976851 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e41d0ce-3d03-4246-a830-b03518c21f89-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.977102 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.977391 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01f528c-b72a-4d75-8480-484e9d5ad79a-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.981395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt" (OuterVolumeSpecName: "kube-api-access-57ppt") pod "2e41d0ce-3d03-4246-a830-b03518c21f89" (UID: "2e41d0ce-3d03-4246-a830-b03518c21f89"). InnerVolumeSpecName "kube-api-access-57ppt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:19:41 crc kubenswrapper[4830]: I0318 18:19:41.981601 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk" (OuterVolumeSpecName: "kube-api-access-mvlrk") pod "c01f528c-b72a-4d75-8480-484e9d5ad79a" (UID: "c01f528c-b72a-4d75-8480-484e9d5ad79a"). InnerVolumeSpecName "kube-api-access-mvlrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.031857 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.044756 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.057588 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:19:42 crc kubenswrapper[4830]: W0318 18:19:42.073662 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34dba61e_bb8b_4d99_9a3c_7d9c5dd4ad15.slice/crio-327b00fdc804b36736db73fc8c691c17804ae744740ed660740eaa0606739c62 WatchSource:0}: Error finding container 327b00fdc804b36736db73fc8c691c17804ae744740ed660740eaa0606739c62: Status 404 returned error can't find the container with id 327b00fdc804b36736db73fc8c691c17804ae744740ed660740eaa0606739c62 Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.079674 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvlrk\" (UniqueName: \"kubernetes.io/projected/c01f528c-b72a-4d75-8480-484e9d5ad79a-kube-api-access-mvlrk\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.079699 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57ppt\" (UniqueName: \"kubernetes.io/projected/2e41d0ce-3d03-4246-a830-b03518c21f89-kube-api-access-57ppt\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.086617 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:19:42 crc kubenswrapper[4830]: W0318 18:19:42.098220 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod435574fa_a924_4289_a93a_dea05d57d105.slice/crio-cf96e6e3aa6cfbde5731d0a8ac5bfd6e0ea77de40696ffd41f2dcf7a5ab2da05 WatchSource:0}: Error finding container cf96e6e3aa6cfbde5731d0a8ac5bfd6e0ea77de40696ffd41f2dcf7a5ab2da05: Status 404 returned error can't find the container with id cf96e6e3aa6cfbde5731d0a8ac5bfd6e0ea77de40696ffd41f2dcf7a5ab2da05 Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.164854 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:19:42 crc kubenswrapper[4830]: W0318 18:19:42.165704 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d11dd9_4b5b_463e_a834_91c7ecc8b021.slice/crio-cfed119a9d9df8a00ccdadf5a187c6fa1edd82ab5917850b8b4c39e4ed1bcd6b WatchSource:0}: Error finding container cfed119a9d9df8a00ccdadf5a187c6fa1edd82ab5917850b8b4c39e4ed1bcd6b: Status 404 returned error can't find the container with id cfed119a9d9df8a00ccdadf5a187c6fa1edd82ab5917850b8b4c39e4ed1bcd6b Mar 18 18:19:42 crc kubenswrapper[4830]: W0318 18:19:42.263821 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b737c7_6b5d_44f4_b05a_de278f4ca572.slice/crio-d486014fb7ebdc66db36195db223508854fbe9541cb559f93347e98398c82346 WatchSource:0}: Error finding container d486014fb7ebdc66db36195db223508854fbe9541cb559f93347e98398c82346: Status 404 returned error can't find the container with id d486014fb7ebdc66db36195db223508854fbe9541cb559f93347e98398c82346 Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.264926 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.605813 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"eceefe95-ad07-4228-ac93-a8f2484ba584","Type":"ContainerStarted","Data":"9051b9364ed790c8b9d8c8949d64fcbefd485723e7a773ed646932bcd1c0e6d0"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.606801 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a0a1e291-1a11-4747-96ed-32c95623dcbb","Type":"ContainerStarted","Data":"d805b73651ccf98e97dab5ad8973d11d3a50eafb0529910eecf3d0ed2bdefb03"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.607791 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerStarted","Data":"cf96e6e3aa6cfbde5731d0a8ac5bfd6e0ea77de40696ffd41f2dcf7a5ab2da05"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.608873 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerStarted","Data":"327b00fdc804b36736db73fc8c691c17804ae744740ed660740eaa0606739c62"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.609982 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerStarted","Data":"cfed119a9d9df8a00ccdadf5a187c6fa1edd82ab5917850b8b4c39e4ed1bcd6b"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.610999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" event={"ID":"c01f528c-b72a-4d75-8480-484e9d5ad79a","Type":"ContainerDied","Data":"f899d169c285580ec65fc1d9ef8b53a7d1ffbd9f4671d4ee7a85291cf58fcda6"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.611027 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5v6nj" Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.613157 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9" event={"ID":"544c01f7-a6da-45de-96f2-9ab9dea0567c","Type":"ContainerStarted","Data":"d3c3e314884865817c6c30dd92c28997f9b3bd149baf807074125a27d2981e24"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.614369 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerStarted","Data":"d486014fb7ebdc66db36195db223508854fbe9541cb559f93347e98398c82346"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.617137 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" event={"ID":"2e41d0ce-3d03-4246-a830-b03518c21f89","Type":"ContainerDied","Data":"1acb21ca7263839ff3e9e5035baaf42d68ae7413294ffdbb7a682aff51d5652a"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.617140 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-brw2k" Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.622895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerStarted","Data":"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333"} Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.667531 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.705984 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5v6nj"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.765028 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:42 crc kubenswrapper[4830]: I0318 18:19:42.771111 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-brw2k"] Mar 18 18:19:43 crc kubenswrapper[4830]: I0318 18:19:43.021695 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:19:43 crc kubenswrapper[4830]: I0318 18:19:43.630490 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerStarted","Data":"c368f83eb29fcf234da438ba1f29502958e03086b3711a354500f4c7b5c5c05a"} Mar 18 18:19:44 crc kubenswrapper[4830]: I0318 18:19:44.245597 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e41d0ce-3d03-4246-a830-b03518c21f89" path="/var/lib/kubelet/pods/2e41d0ce-3d03-4246-a830-b03518c21f89/volumes" Mar 18 18:19:44 crc kubenswrapper[4830]: I0318 18:19:44.246485 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01f528c-b72a-4d75-8480-484e9d5ad79a" path="/var/lib/kubelet/pods/c01f528c-b72a-4d75-8480-484e9d5ad79a/volumes" Mar 18 
18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.712369 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eceefe95-ad07-4228-ac93-a8f2484ba584","Type":"ContainerStarted","Data":"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.713130 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.719255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a0a1e291-1a11-4747-96ed-32c95623dcbb","Type":"ContainerStarted","Data":"c6c30f91c3c07f2417a561616bc4ab4ba1863961710fa17a2a7105a6e4af19cd"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.719388 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.720650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerStarted","Data":"3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.726375 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerStarted","Data":"8a2b2534baed3a130b8121d69c5626b8abb92c0dc65a019d61420e4ccd6e5352"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.731552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerStarted","Data":"e20da5e2c4bbdc2778d58c4baf6547e914e5cc2ec137efafe2f7cd9631a76c14"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.732837 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=18.175396737 podStartE2EDuration="27.732817313s" podCreationTimestamp="2026-03-18 18:19:24 +0000 UTC" firstStartedPulling="2026-03-18 18:19:41.727306317 +0000 UTC m=+1016.294936649" lastFinishedPulling="2026-03-18 18:19:51.284726893 +0000 UTC m=+1025.852357225" observedRunningTime="2026-03-18 18:19:51.729783798 +0000 UTC m=+1026.297414120" watchObservedRunningTime="2026-03-18 18:19:51.732817313 +0000 UTC m=+1026.300447655" Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.734014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerStarted","Data":"18ea142ceb8d413ddf0c7ab0f2cbe4c96f2ce6a59c01ff0b773b207d1a0b8f74"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.736268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerStarted","Data":"e27720e7dca97ec5784c549e6e6c7e84e6b4913613d159710e88f4288654e511"} Mar 18 18:19:51 crc kubenswrapper[4830]: I0318 18:19:51.755840 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.536679262 podStartE2EDuration="29.755821302s" podCreationTimestamp="2026-03-18 18:19:22 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.0604243 +0000 UTC m=+1016.628054632" lastFinishedPulling="2026-03-18 18:19:51.2795663 +0000 UTC m=+1025.847196672" observedRunningTime="2026-03-18 18:19:51.750670639 +0000 UTC m=+1026.318300981" watchObservedRunningTime="2026-03-18 18:19:51.755821302 +0000 UTC m=+1026.323451634" Mar 18 18:19:52 crc kubenswrapper[4830]: I0318 18:19:52.750399 4830 generic.go:334] "Generic (PLEG): container finished" podID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerID="18ea142ceb8d413ddf0c7ab0f2cbe4c96f2ce6a59c01ff0b773b207d1a0b8f74" exitCode=0 Mar 18 18:19:52 crc 
kubenswrapper[4830]: I0318 18:19:52.750544 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerDied","Data":"18ea142ceb8d413ddf0c7ab0f2cbe4c96f2ce6a59c01ff0b773b207d1a0b8f74"} Mar 18 18:19:52 crc kubenswrapper[4830]: I0318 18:19:52.753559 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9" event={"ID":"544c01f7-a6da-45de-96f2-9ab9dea0567c","Type":"ContainerStarted","Data":"7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337"} Mar 18 18:19:52 crc kubenswrapper[4830]: I0318 18:19:52.797761 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-chwf9" podStartSLOduration=15.571648021 podStartE2EDuration="24.797743824s" podCreationTimestamp="2026-03-18 18:19:28 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.060931814 +0000 UTC m=+1016.628562146" lastFinishedPulling="2026-03-18 18:19:51.287027617 +0000 UTC m=+1025.854657949" observedRunningTime="2026-03-18 18:19:52.786739428 +0000 UTC m=+1027.354369760" watchObservedRunningTime="2026-03-18 18:19:52.797743824 +0000 UTC m=+1027.365374156" Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.489810 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-chwf9" Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.767904 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerStarted","Data":"880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50"} Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.767980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" 
event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerStarted","Data":"4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc"} Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.768068 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.768356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:19:53 crc kubenswrapper[4830]: I0318 18:19:53.802716 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dv8kn" podStartSLOduration=16.784755651 podStartE2EDuration="25.802696037s" podCreationTimestamp="2026-03-18 18:19:28 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.266010036 +0000 UTC m=+1016.833640368" lastFinishedPulling="2026-03-18 18:19:51.283950422 +0000 UTC m=+1025.851580754" observedRunningTime="2026-03-18 18:19:53.789843389 +0000 UTC m=+1028.357473761" watchObservedRunningTime="2026-03-18 18:19:53.802696037 +0000 UTC m=+1028.370326379" Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.786118 4830 generic.go:334] "Generic (PLEG): container finished" podID="435574fa-a924-4289-a93a-dea05d57d105" containerID="8a2b2534baed3a130b8121d69c5626b8abb92c0dc65a019d61420e4ccd6e5352" exitCode=0 Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.786253 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerDied","Data":"8a2b2534baed3a130b8121d69c5626b8abb92c0dc65a019d61420e4ccd6e5352"} Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.789392 4830 generic.go:334] "Generic (PLEG): container finished" podID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerID="e20da5e2c4bbdc2778d58c4baf6547e914e5cc2ec137efafe2f7cd9631a76c14" exitCode=0 Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 
18:19:55.789478 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerDied","Data":"e20da5e2c4bbdc2778d58c4baf6547e914e5cc2ec137efafe2f7cd9631a76c14"} Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.791246 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerStarted","Data":"6d02c3d4022f8ff71336fe32eb97efefa0f42dad83cb62b31f62c9f071d62b10"} Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.797244 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerStarted","Data":"e6ff33896ab819ecb0f974d24f8341e4cd47187d5b83f4031d921d854055799e"} Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.799880 4830 generic.go:334] "Generic (PLEG): container finished" podID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerID="038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7" exitCode=0 Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.799943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" event={"ID":"24d85c15-a28a-40ef-92cb-611d03123bc8","Type":"ContainerDied","Data":"038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7"} Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.855100 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.982522699 podStartE2EDuration="25.855058643s" podCreationTimestamp="2026-03-18 18:19:30 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.168938947 +0000 UTC m=+1016.736569279" lastFinishedPulling="2026-03-18 18:19:55.041474851 +0000 UTC m=+1029.609105223" observedRunningTime="2026-03-18 18:19:55.84523676 +0000 UTC m=+1030.412867122" watchObservedRunningTime="2026-03-18 
18:19:55.855058643 +0000 UTC m=+1030.422688985" Mar 18 18:19:55 crc kubenswrapper[4830]: I0318 18:19:55.934821 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.017500796 podStartE2EDuration="26.934759969s" podCreationTimestamp="2026-03-18 18:19:29 +0000 UTC" firstStartedPulling="2026-03-18 18:19:43.104434948 +0000 UTC m=+1017.672065280" lastFinishedPulling="2026-03-18 18:19:55.021694081 +0000 UTC m=+1029.589324453" observedRunningTime="2026-03-18 18:19:55.93011619 +0000 UTC m=+1030.497746572" watchObservedRunningTime="2026-03-18 18:19:55.934759969 +0000 UTC m=+1030.502390311" Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.722327 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.809916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerStarted","Data":"1faa9ae26dd3b098b13664816e21acb92be7c62408cd3cf5567216f95dc7ad27"} Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.815056 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerStarted","Data":"4b3823ab703387f205ec3b36349fb621c98a8c89a6e4303228224586840c10d9"} Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.817280 4830 generic.go:334] "Generic (PLEG): container finished" podID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerID="c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d" exitCode=0 Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.817363 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" 
event={"ID":"2046a38e-0101-47a2-88d5-f91ca521cb9a","Type":"ContainerDied","Data":"c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d"} Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.819733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" event={"ID":"24d85c15-a28a-40ef-92cb-611d03123bc8","Type":"ContainerStarted","Data":"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"} Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.820152 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.823352 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerStarted","Data":"01d8e91004d318c41a6579e547dd6425e1913b522dba6cd78012d1eca9d7aedf"} Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.831965 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.629093989 podStartE2EDuration="36.831949986s" podCreationTimestamp="2026-03-18 18:19:20 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.076663651 +0000 UTC m=+1016.644293983" lastFinishedPulling="2026-03-18 18:19:51.279519648 +0000 UTC m=+1025.847149980" observedRunningTime="2026-03-18 18:19:56.828557001 +0000 UTC m=+1031.396187373" watchObservedRunningTime="2026-03-18 18:19:56.831949986 +0000 UTC m=+1031.399580318" Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.871357 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" podStartSLOduration=3.518884097 podStartE2EDuration="38.871338101s" podCreationTimestamp="2026-03-18 18:19:18 +0000 UTC" firstStartedPulling="2026-03-18 18:19:19.646449713 +0000 UTC m=+994.214080045" lastFinishedPulling="2026-03-18 
18:19:54.998903677 +0000 UTC m=+1029.566534049" observedRunningTime="2026-03-18 18:19:56.869206682 +0000 UTC m=+1031.436837034" watchObservedRunningTime="2026-03-18 18:19:56.871338101 +0000 UTC m=+1031.438968433" Mar 18 18:19:56 crc kubenswrapper[4830]: I0318 18:19:56.918481 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.73382503 podStartE2EDuration="35.918460891s" podCreationTimestamp="2026-03-18 18:19:21 +0000 UTC" firstStartedPulling="2026-03-18 18:19:42.100118243 +0000 UTC m=+1016.667748575" lastFinishedPulling="2026-03-18 18:19:51.284754104 +0000 UTC m=+1025.852384436" observedRunningTime="2026-03-18 18:19:56.917058322 +0000 UTC m=+1031.484688674" watchObservedRunningTime="2026-03-18 18:19:56.918460891 +0000 UTC m=+1031.486091223" Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.729085 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.791833 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.837223 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" event={"ID":"2046a38e-0101-47a2-88d5-f91ca521cb9a","Type":"ContainerStarted","Data":"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"} Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.838263 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.872847 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" podStartSLOduration=-9223371996.981962 podStartE2EDuration="39.872813016s" podCreationTimestamp="2026-03-18 18:19:18 +0000 UTC" firstStartedPulling="2026-03-18 
18:19:19.939690476 +0000 UTC m=+994.507320808" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:19:57.86001286 +0000 UTC m=+1032.427643222" watchObservedRunningTime="2026-03-18 18:19:57.872813016 +0000 UTC m=+1032.440443378" Mar 18 18:19:57 crc kubenswrapper[4830]: I0318 18:19:57.902359 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.123051 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.226242 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.255814 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jkvj9"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.256846 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.261007 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.272190 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jkvj9"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.291713 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.292942 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.299520 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.316827 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.387672 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388085 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388106 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 
18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388126 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388210 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388295 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkfb5\" (UniqueName: \"kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6njw\" (UniqueName: \"kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: 
\"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.388437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490081 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkfb5\" (UniqueName: \"kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490135 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6njw\" (UniqueName: \"kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490172 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490246 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: 
\"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 
18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.490799 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.491414 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.491448 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.491499 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.492064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.496860 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.503445 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.509397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkfb5\" (UniqueName: \"kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5\") pod \"ovn-controller-metrics-jkvj9\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.509508 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6njw\" (UniqueName: 
\"kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw\") pod \"dnsmasq-dns-84d7bcdf99-dpxx8\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.585744 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.617261 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.685294 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.709375 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"] Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.710851 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.713842 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.722453 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.728603 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"]
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.787987 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.794677 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whg87\" (UniqueName: \"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.794730 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.794832 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.794882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.794916 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.845040 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="dnsmasq-dns" containerID="cri-o://08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1" gracePeriod=10
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.845617 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.846119 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="dnsmasq-dns" containerID="cri-o://3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e" gracePeriod=10
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.889037 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.896084 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.896166 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.896235 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.896287 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whg87\" (UniqueName: \"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.896327 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.897081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.897351 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.897649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.898375 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:58 crc kubenswrapper[4830]: I0318 18:19:58.926635 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whg87\" (UniqueName: \"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87\") pod \"dnsmasq-dns-f697c8bff-4zxns\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.040233 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.041374 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.046406 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.046761 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.047165 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.048054 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gt9rp"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.057042 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-4zxns"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.062599 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100206 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100290 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100497 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100623 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5mg\" (UniqueName: \"kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100725 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.100847 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.122550 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jkvj9"]
Mar 18 18:19:59 crc kubenswrapper[4830]: W0318 18:19:59.135458 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60094d0f_d530_424e_92d1_62e473acc664.slice/crio-250dd4e24f2c79323606d92cdadeed2e6b85ef56d96df84143ca1b4bbede7635 WatchSource:0}: Error finding container 250dd4e24f2c79323606d92cdadeed2e6b85ef56d96df84143ca1b4bbede7635: Status 404 returned error can't find the container with id 250dd4e24f2c79323606d92cdadeed2e6b85ef56d96df84143ca1b4bbede7635
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.183840 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"]
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5mg\" (UniqueName: \"kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202249 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202272 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.202725 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.203104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.203672 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.211133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.211620 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.214040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.230269 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5mg\" (UniqueName: \"kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg\") pod \"ovn-northd-0\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.358250 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.375455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.477063 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509353 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509434 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zswc7\" (UniqueName: \"kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7\") pod \"2046a38e-0101-47a2-88d5-f91ca521cb9a\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509431 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509533 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config\") pod \"2046a38e-0101-47a2-88d5-f91ca521cb9a\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509588 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.509701 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc\") pod \"2046a38e-0101-47a2-88d5-f91ca521cb9a\" (UID: \"2046a38e-0101-47a2-88d5-f91ca521cb9a\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.510410 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.510473 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359" gracePeriod=600
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.515267 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7" (OuterVolumeSpecName: "kube-api-access-zswc7") pod "2046a38e-0101-47a2-88d5-f91ca521cb9a" (UID: "2046a38e-0101-47a2-88d5-f91ca521cb9a"). InnerVolumeSpecName "kube-api-access-zswc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.795300 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zswc7\" (UniqueName: \"kubernetes.io/projected/2046a38e-0101-47a2-88d5-f91ca521cb9a-kube-api-access-zswc7\") on node \"crc\" DevicePath \"\""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.807627 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"]
Mar 18 18:19:59 crc kubenswrapper[4830]: W0318 18:19:59.816211 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a9f706f_6103_4f0e_bf89_0389d47ef9ed.slice/crio-4ddcb02676be975041680d561dab6178bae6e3ccac9cfcd37497a1bf0db1dbba WatchSource:0}: Error finding container 4ddcb02676be975041680d561dab6178bae6e3ccac9cfcd37497a1bf0db1dbba: Status 404 returned error can't find the container with id 4ddcb02676be975041680d561dab6178bae6e3ccac9cfcd37497a1bf0db1dbba
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.829055 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2046a38e-0101-47a2-88d5-f91ca521cb9a" (UID: "2046a38e-0101-47a2-88d5-f91ca521cb9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.861295 4830 generic.go:334] "Generic (PLEG): container finished" podID="91e72012-8cc7-45c1-b677-6e3b666627da" containerID="85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8" exitCode=0
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.861369 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" event={"ID":"91e72012-8cc7-45c1-b677-6e3b666627da","Type":"ContainerDied","Data":"85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.861401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" event={"ID":"91e72012-8cc7-45c1-b677-6e3b666627da","Type":"ContainerStarted","Data":"23b27a22c9cdd64ff4d76031042da32afc4d13c172bbf183ecb61aa0119913d3"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.863671 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config" (OuterVolumeSpecName: "config") pod "2046a38e-0101-47a2-88d5-f91ca521cb9a" (UID: "2046a38e-0101-47a2-88d5-f91ca521cb9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.865049 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jkvj9" event={"ID":"60094d0f-d530-424e-92d1-62e473acc664","Type":"ContainerStarted","Data":"42b14e059955cc8e166c0627991a760521592d7af71c5890c3dca6e2c64b9fb8"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.865105 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jkvj9" event={"ID":"60094d0f-d530-424e-92d1-62e473acc664","Type":"ContainerStarted","Data":"250dd4e24f2c79323606d92cdadeed2e6b85ef56d96df84143ca1b4bbede7635"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.872274 4830 generic.go:334] "Generic (PLEG): container finished" podID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerID="3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e" exitCode=0
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.872380 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" event={"ID":"2046a38e-0101-47a2-88d5-f91ca521cb9a","Type":"ContainerDied","Data":"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.872462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww" event={"ID":"2046a38e-0101-47a2-88d5-f91ca521cb9a","Type":"ContainerDied","Data":"e3cd4b0e68a6a086a43f45d40803a1844b42f413c54a139e981cd9ccadb0bf78"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.872481 4830 scope.go:117] "RemoveContainer" containerID="3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.874087 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-jxtww"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.874455 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" event={"ID":"3a9f706f-6103-4f0e-bf89-0389d47ef9ed","Type":"ContainerStarted","Data":"4ddcb02676be975041680d561dab6178bae6e3ccac9cfcd37497a1bf0db1dbba"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.878436 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.880878 4830 generic.go:334] "Generic (PLEG): container finished" podID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerID="08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1" exitCode=0
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.880968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" event={"ID":"24d85c15-a28a-40ef-92cb-611d03123bc8","Type":"ContainerDied","Data":"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.881005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp" event={"ID":"24d85c15-a28a-40ef-92cb-611d03123bc8","Type":"ContainerDied","Data":"511b611d0df49bd8e363b7b1c46b22bb1df50b4a0e1d6d37c526bb4e07f50710"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.881209 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-vvwtp"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.889727 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359" exitCode=0
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.900156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359"}
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.902163 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config\") pod \"24d85c15-a28a-40ef-92cb-611d03123bc8\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.902275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9k9w\" (UniqueName: \"kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w\") pod \"24d85c15-a28a-40ef-92cb-611d03123bc8\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.902881 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc\") pod \"24d85c15-a28a-40ef-92cb-611d03123bc8\" (UID: \"24d85c15-a28a-40ef-92cb-611d03123bc8\") "
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.907984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w" (OuterVolumeSpecName: "kube-api-access-m9k9w") pod "24d85c15-a28a-40ef-92cb-611d03123bc8" (UID: "24d85c15-a28a-40ef-92cb-611d03123bc8"). InnerVolumeSpecName "kube-api-access-m9k9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.917675 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.927990 4830 scope.go:117] "RemoveContainer" containerID="c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.932849 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2046a38e-0101-47a2-88d5-f91ca521cb9a-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.988004 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jkvj9" podStartSLOduration=1.986872499 podStartE2EDuration="1.986872499s" podCreationTimestamp="2026-03-18 18:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:19:59.936479957 +0000 UTC m=+1034.504110299" watchObservedRunningTime="2026-03-18 18:19:59.986872499 +0000 UTC m=+1034.554502831"
Mar 18 18:19:59 crc kubenswrapper[4830]: I0318 18:19:59.998983 4830 scope.go:117] "RemoveContainer" containerID="3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.000455 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e\": container with ID starting with 3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e not found: ID does not exist" containerID="3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.000502 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e"} err="failed to get container status \"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e\": rpc error: code = NotFound desc = could not find container \"3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e\": container with ID starting with 3ad90e9bb629396916279f41abb6b41dae58a2802d7ab12610a2e1187406f14e not found: ID does not exist"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.000522 4830 scope.go:117] "RemoveContainer" containerID="c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.001320 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"]
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.001463 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d\": container with ID starting with c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d not found: ID does not exist" containerID="c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.001488 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d"} err="failed to get container status \"c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d\": rpc error: code = NotFound desc = could not find container \"c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d\": container with ID starting with c91e68b7a6e0067e0e3cd3bab48a1b742b18fcf09cbfeeeb780dc948fb3e3c5d not found: ID does not exist"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.001502 4830 scope.go:117] "RemoveContainer" containerID="08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.019584 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-jxtww"]
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.034713 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config" (OuterVolumeSpecName: "config") pod "24d85c15-a28a-40ef-92cb-611d03123bc8" (UID: "24d85c15-a28a-40ef-92cb-611d03123bc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.035360 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.035383 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9k9w\" (UniqueName: \"kubernetes.io/projected/24d85c15-a28a-40ef-92cb-611d03123bc8-kube-api-access-m9k9w\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.041371 4830 scope.go:117] "RemoveContainer" containerID="038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.048945 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24d85c15-a28a-40ef-92cb-611d03123bc8" (UID: "24d85c15-a28a-40ef-92cb-611d03123bc8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.080936 4830 scope.go:117] "RemoveContainer" containerID="08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.085475 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1\": container with ID starting with 08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1 not found: ID does not exist" containerID="08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.085522 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1"} err="failed to get container status \"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1\": rpc error: code = NotFound desc = could not find container \"08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1\": container with ID starting with 08890f6548cf5896aac9ebc71a44c7267681c510333e48d52e04ea4f1bdf5cb1 not found: ID does not exist"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.085548 4830 scope.go:117] "RemoveContainer" containerID="038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.085844 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7\": container with ID starting with 038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7 not found: ID does not exist" containerID="038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.085884 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7"} err="failed to get container status \"038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7\": rpc error: code = NotFound desc = could not find container \"038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7\": container with ID starting with 038e1a05a3225fcb0be592a1b895c7b6a948afc837d895acea6636d4233cabc7 not found: ID does not exist"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.085913 4830 scope.go:117] "RemoveContainer" containerID="3a569bf099365538438bf2523866621050b3b655b0210e45d89e9932425c1a49"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.136745 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d85c15-a28a-40ef-92cb-611d03123bc8-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.139935 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564300-tc5tf"]
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.140295 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140312 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.140348 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="init"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140355 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="init"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.140368 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140375 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: E0318 18:20:00.140393 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="init"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140399 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="init"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140583 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.140605 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" containerName="dnsmasq-dns"
Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.141168 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.150884 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-tc5tf"] Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.185636 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.185797 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.185965 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.237563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxdbd\" (UniqueName: \"kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd\") pod \"auto-csr-approver-29564300-tc5tf\" (UID: \"e0ac501e-b22c-4dd7-8e7e-51c56f870890\") " pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.243439 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2046a38e-0101-47a2-88d5-f91ca521cb9a" path="/var/lib/kubelet/pods/2046a38e-0101-47a2-88d5-f91ca521cb9a/volumes" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.244093 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.244172 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-vvwtp"] Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.339881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxdbd\" (UniqueName: 
\"kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd\") pod \"auto-csr-approver-29564300-tc5tf\" (UID: \"e0ac501e-b22c-4dd7-8e7e-51c56f870890\") " pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.358248 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxdbd\" (UniqueName: \"kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd\") pod \"auto-csr-approver-29564300-tc5tf\" (UID: \"e0ac501e-b22c-4dd7-8e7e-51c56f870890\") " pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.529156 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.797826 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-tc5tf"] Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.897427 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerStarted","Data":"38175a96a48085f7db7e366f32d5fbfb42fa11538c532e62066eb897a627791b"} Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.900309 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" event={"ID":"e0ac501e-b22c-4dd7-8e7e-51c56f870890","Type":"ContainerStarted","Data":"93da18fdfc7697ae25d5bbf4aecb0b7cd40372ee255ded5971973d1f8242dfa1"} Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.901825 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" event={"ID":"91e72012-8cc7-45c1-b677-6e3b666627da","Type":"ContainerStarted","Data":"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"} Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 
18:20:00.901968 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.903979 4830 generic.go:334] "Generic (PLEG): container finished" podID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerID="02cf15d2b959ead835b6e8299f4c0e47ada8a43833e6352b80f8cccd3d2f01d4" exitCode=0 Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.904028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" event={"ID":"3a9f706f-6103-4f0e-bf89-0389d47ef9ed","Type":"ContainerDied","Data":"02cf15d2b959ead835b6e8299f4c0e47ada8a43833e6352b80f8cccd3d2f01d4"} Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.907758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c"} Mar 18 18:20:00 crc kubenswrapper[4830]: I0318 18:20:00.949627 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" podStartSLOduration=2.949608158 podStartE2EDuration="2.949608158s" podCreationTimestamp="2026-03-18 18:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:00.92521261 +0000 UTC m=+1035.492842932" watchObservedRunningTime="2026-03-18 18:20:00.949608158 +0000 UTC m=+1035.517238490" Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 18:20:01.768298 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 18:20:01.769014 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 
18:20:01.931804 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" event={"ID":"3a9f706f-6103-4f0e-bf89-0389d47ef9ed","Type":"ContainerStarted","Data":"6b0a1000ef0b2d9f58a639bef40f03b22a1d61614e2c9ca09ec4a411f863e252"} Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 18:20:01.932723 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 18:20:01.933467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerStarted","Data":"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da"} Mar 18 18:20:01 crc kubenswrapper[4830]: I0318 18:20:01.952000 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" podStartSLOduration=3.951985589 podStartE2EDuration="3.951985589s" podCreationTimestamp="2026-03-18 18:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:01.949835309 +0000 UTC m=+1036.517465641" watchObservedRunningTime="2026-03-18 18:20:01.951985589 +0000 UTC m=+1036.519615921" Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.246683 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d85c15-a28a-40ef-92cb-611d03123bc8" path="/var/lib/kubelet/pods/24d85c15-a28a-40ef-92cb-611d03123bc8/volumes" Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.678912 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.679318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.818852 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.944473 4830 generic.go:334] "Generic (PLEG): container finished" podID="e0ac501e-b22c-4dd7-8e7e-51c56f870890" containerID="ab2208ca95c916d6035b8232ffd1553a2e84b6421f7e81beebc4d247d69149c0" exitCode=0 Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.944556 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" event={"ID":"e0ac501e-b22c-4dd7-8e7e-51c56f870890","Type":"ContainerDied","Data":"ab2208ca95c916d6035b8232ffd1553a2e84b6421f7e81beebc4d247d69149c0"} Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.948267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerStarted","Data":"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78"} Mar 18 18:20:02 crc kubenswrapper[4830]: I0318 18:20:02.948302 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 18:20:03 crc kubenswrapper[4830]: I0318 18:20:03.005149 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.393804328 podStartE2EDuration="4.005121991s" podCreationTimestamp="2026-03-18 18:19:59 +0000 UTC" firstStartedPulling="2026-03-18 18:19:59.928273239 +0000 UTC m=+1034.495903571" lastFinishedPulling="2026-03-18 18:20:01.539590902 +0000 UTC m=+1036.107221234" observedRunningTime="2026-03-18 18:20:02.99104802 +0000 UTC m=+1037.558678352" watchObservedRunningTime="2026-03-18 18:20:03.005121991 +0000 UTC m=+1037.572752363" Mar 18 18:20:03 crc kubenswrapper[4830]: I0318 18:20:03.059491 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.128433 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.248660 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.363463 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.438277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxdbd\" (UniqueName: \"kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd\") pod \"e0ac501e-b22c-4dd7-8e7e-51c56f870890\" (UID: \"e0ac501e-b22c-4dd7-8e7e-51c56f870890\") " Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.446147 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd" (OuterVolumeSpecName: "kube-api-access-sxdbd") pod "e0ac501e-b22c-4dd7-8e7e-51c56f870890" (UID: "e0ac501e-b22c-4dd7-8e7e-51c56f870890"). InnerVolumeSpecName "kube-api-access-sxdbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.500816 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2062-account-create-update-zb4gw"] Mar 18 18:20:04 crc kubenswrapper[4830]: E0318 18:20:04.501253 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ac501e-b22c-4dd7-8e7e-51c56f870890" containerName="oc" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.501264 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ac501e-b22c-4dd7-8e7e-51c56f870890" containerName="oc" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.501492 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ac501e-b22c-4dd7-8e7e-51c56f870890" containerName="oc" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.502163 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.504210 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.508914 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2062-account-create-update-zb4gw"] Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.540595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmch7\" (UniqueName: \"kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.540703 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.540839 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxdbd\" (UniqueName: \"kubernetes.io/projected/e0ac501e-b22c-4dd7-8e7e-51c56f870890-kube-api-access-sxdbd\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.643249 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmch7\" (UniqueName: \"kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.643536 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.644756 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.664651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmch7\" (UniqueName: 
\"kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7\") pod \"placement-2062-account-create-update-zb4gw\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") " pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:04 crc kubenswrapper[4830]: I0318 18:20:04.823741 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-zb4gw" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:04.999441 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.005197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-tc5tf" event={"ID":"e0ac501e-b22c-4dd7-8e7e-51c56f870890","Type":"ContainerDied","Data":"93da18fdfc7697ae25d5bbf4aecb0b7cd40372ee255ded5971973d1f8242dfa1"} Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.005282 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93da18fdfc7697ae25d5bbf4aecb0b7cd40372ee255ded5971973d1f8242dfa1" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.278428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.297939 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.298172 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="dnsmasq-dns" containerID="cri-o://168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c" gracePeriod=10 Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.298935 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.336916 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2062-account-create-update-zb4gw"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.364246 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.366202 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.394053 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459343 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-mrcl8"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459659 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8b4m\" (UniqueName: \"kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459728 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459750 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.459857 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.472318 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-mrcl8"] Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.561838 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.561896 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 
18:20:05.561942 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8b4m\" (UniqueName: \"kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.561972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.561989 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.562901 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.563675 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.564292 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.564704 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.594359 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8b4m\" (UniqueName: \"kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m\") pod \"dnsmasq-dns-b4ddd5fb7-4q6g2\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.747288 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.764263 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc\") pod \"91e72012-8cc7-45c1-b677-6e3b666627da\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.764375 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6njw\" (UniqueName: \"kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw\") pod \"91e72012-8cc7-45c1-b677-6e3b666627da\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.764408 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config\") pod \"91e72012-8cc7-45c1-b677-6e3b666627da\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.765170 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb\") pod \"91e72012-8cc7-45c1-b677-6e3b666627da\" (UID: \"91e72012-8cc7-45c1-b677-6e3b666627da\") " Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.768258 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw" (OuterVolumeSpecName: "kube-api-access-r6njw") pod "91e72012-8cc7-45c1-b677-6e3b666627da" (UID: "91e72012-8cc7-45c1-b677-6e3b666627da"). InnerVolumeSpecName "kube-api-access-r6njw". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.790667 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.810610 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91e72012-8cc7-45c1-b677-6e3b666627da" (UID: "91e72012-8cc7-45c1-b677-6e3b666627da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.811665 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config" (OuterVolumeSpecName: "config") pod "91e72012-8cc7-45c1-b677-6e3b666627da" (UID: "91e72012-8cc7-45c1-b677-6e3b666627da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.818227 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91e72012-8cc7-45c1-b677-6e3b666627da" (UID: "91e72012-8cc7-45c1-b677-6e3b666627da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.867450 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.867486 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.867499 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6njw\" (UniqueName: \"kubernetes.io/projected/91e72012-8cc7-45c1-b677-6e3b666627da-kube-api-access-r6njw\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:05 crc kubenswrapper[4830]: I0318 18:20:05.867515 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e72012-8cc7-45c1-b677-6e3b666627da-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.010062 4830 generic.go:334] "Generic (PLEG): container finished" podID="91e72012-8cc7-45c1-b677-6e3b666627da" containerID="168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c" exitCode=0
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.010140 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.010199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" event={"ID":"91e72012-8cc7-45c1-b677-6e3b666627da","Type":"ContainerDied","Data":"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"}
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.010257 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-dpxx8" event={"ID":"91e72012-8cc7-45c1-b677-6e3b666627da","Type":"ContainerDied","Data":"23b27a22c9cdd64ff4d76031042da32afc4d13c172bbf183ecb61aa0119913d3"}
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.010279 4830 scope.go:117] "RemoveContainer" containerID="168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.013534 4830 generic.go:334] "Generic (PLEG): container finished" podID="ca5d6885-918e-49f2-8fdf-0098353bb996" containerID="a98d18015621f551484484a4e5423ae99f7e30bab0f975455f87bfd8b54217cb" exitCode=0
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.013596 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2062-account-create-update-zb4gw" event={"ID":"ca5d6885-918e-49f2-8fdf-0098353bb996","Type":"ContainerDied","Data":"a98d18015621f551484484a4e5423ae99f7e30bab0f975455f87bfd8b54217cb"}
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.013633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2062-account-create-update-zb4gw" event={"ID":"ca5d6885-918e-49f2-8fdf-0098353bb996","Type":"ContainerStarted","Data":"754503ce4d1e48bc6b908f995b1ac770dceebc9d3b66fe13ed74f2d7a883ad1f"}
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.052592 4830 scope.go:117] "RemoveContainer" containerID="85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.068077 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"]
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.076416 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-dpxx8"]
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.083135 4830 scope.go:117] "RemoveContainer" containerID="168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.083594 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c\": container with ID starting with 168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c not found: ID does not exist" containerID="168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.083627 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c"} err="failed to get container status \"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c\": rpc error: code = NotFound desc = could not find container \"168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c\": container with ID starting with 168906caefd0a04d148288840d80e3c1166ab95f7792da7ef3027d8287abce0c not found: ID does not exist"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.083648 4830 scope.go:117] "RemoveContainer" containerID="85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8"
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.085327 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8\": container with ID starting with 85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8 not found: ID does not exist" containerID="85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.085353 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8"} err="failed to get container status \"85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8\": rpc error: code = NotFound desc = could not find container \"85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8\": container with ID starting with 85eb6bb8fec46075293083ba5b98ceca1567d61de2237f52043628258bd6cff8 not found: ID does not exist"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.217960 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"]
Mar 18 18:20:06 crc kubenswrapper[4830]: W0318 18:20:06.221661 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda93ae87c_c4d2_4dec_af01_3478996b70fc.slice/crio-11abd75f2828fbeab59adc44c628e5b1236435d08218147f26257bab514e3f14 WatchSource:0}: Error finding container 11abd75f2828fbeab59adc44c628e5b1236435d08218147f26257bab514e3f14: Status 404 returned error can't find the container with id 11abd75f2828fbeab59adc44c628e5b1236435d08218147f26257bab514e3f14
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.247017 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368156cb-80ff-479d-8417-2ff46f33363f" path="/var/lib/kubelet/pods/368156cb-80ff-479d-8417-2ff46f33363f/volumes"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.247955 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" path="/var/lib/kubelet/pods/91e72012-8cc7-45c1-b677-6e3b666627da/volumes"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.478504 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.479757 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="dnsmasq-dns"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.479852 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="dnsmasq-dns"
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.479944 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="init"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.480001 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="init"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.480204 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e72012-8cc7-45c1-b677-6e3b666627da" containerName="dnsmasq-dns"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.487762 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.489605 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.489680 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.490841 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h8ljs"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.504246 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.524218 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579281 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrmf\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579630 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579660 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579764 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.579837 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681052 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrmf\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681345 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681391 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681426 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.681449 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.681465 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 18:20:06 crc kubenswrapper[4830]: E0318 18:20:06.681504 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift podName:ccc6cbaa-b562-49fc-9add-94aac04d60ed nodeName:}" failed. No retries permitted until 2026-03-18 18:20:07.181488865 +0000 UTC m=+1041.749119197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift") pod "swift-storage-0" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed") : configmap "swift-ring-files" not found
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681740 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.681923 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.682054 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.686917 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.704759 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.705420 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrmf\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.837843 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nmp7q"]
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.839510 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.847499 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.848136 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.849029 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.851447 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nmp7q"]
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884465 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljb9m\" (UniqueName: \"kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884575 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.884830 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.986646 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljb9m\" (UniqueName: \"kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.986761 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.986885 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.986945 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.987061 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.987212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.987280 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.993578 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.993890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:06 crc kubenswrapper[4830]: I0318 18:20:06.997371 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.000536 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.004431 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.004533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.018662 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljb9m\" (UniqueName: \"kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m\") pod \"swift-ring-rebalance-nmp7q\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") " pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.037659 4830 generic.go:334] "Generic (PLEG): container finished" podID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerID="ca0e0174fbb956aab4d64c5335a5c9127ee141ff9977768dc8b0aac8c5436c5d" exitCode=0
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.038678 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" event={"ID":"a93ae87c-c4d2-4dec-af01-3478996b70fc","Type":"ContainerDied","Data":"ca0e0174fbb956aab4d64c5335a5c9127ee141ff9977768dc8b0aac8c5436c5d"}
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.038710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" event={"ID":"a93ae87c-c4d2-4dec-af01-3478996b70fc","Type":"ContainerStarted","Data":"11abd75f2828fbeab59adc44c628e5b1236435d08218147f26257bab514e3f14"}
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.190413 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:07 crc kubenswrapper[4830]: E0318 18:20:07.190609 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 18:20:07 crc kubenswrapper[4830]: E0318 18:20:07.190631 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 18:20:07 crc kubenswrapper[4830]: E0318 18:20:07.190681 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift podName:ccc6cbaa-b562-49fc-9add-94aac04d60ed nodeName:}" failed. No retries permitted until 2026-03-18 18:20:08.190666903 +0000 UTC m=+1042.758297235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift") pod "swift-storage-0" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed") : configmap "swift-ring-files" not found
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.199212 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.282910 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-zb4gw"
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.395565 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmch7\" (UniqueName: \"kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7\") pod \"ca5d6885-918e-49f2-8fdf-0098353bb996\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") "
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.395847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts\") pod \"ca5d6885-918e-49f2-8fdf-0098353bb996\" (UID: \"ca5d6885-918e-49f2-8fdf-0098353bb996\") "
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.397043 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca5d6885-918e-49f2-8fdf-0098353bb996" (UID: "ca5d6885-918e-49f2-8fdf-0098353bb996"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.401677 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7" (OuterVolumeSpecName: "kube-api-access-dmch7") pod "ca5d6885-918e-49f2-8fdf-0098353bb996" (UID: "ca5d6885-918e-49f2-8fdf-0098353bb996"). InnerVolumeSpecName "kube-api-access-dmch7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.497556 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca5d6885-918e-49f2-8fdf-0098353bb996-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.497584 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmch7\" (UniqueName: \"kubernetes.io/projected/ca5d6885-918e-49f2-8fdf-0098353bb996-kube-api-access-dmch7\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.644221 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nmp7q"]
Mar 18 18:20:07 crc kubenswrapper[4830]: I0318 18:20:07.657460 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.050722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmp7q" event={"ID":"cb262f8b-f0ed-4644-b313-2a2b46815860","Type":"ContainerStarted","Data":"ef7cb5dcf01c3803669c426fc1213faf11fd090df14fffeb0df9b3a3c040c170"}
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.055403 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" event={"ID":"a93ae87c-c4d2-4dec-af01-3478996b70fc","Type":"ContainerStarted","Data":"846a2ec81ed8d83c06ca8e866ae1099194cbad679ba89dee281a5483357a279a"}
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.055583 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.058479 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2062-account-create-update-zb4gw" event={"ID":"ca5d6885-918e-49f2-8fdf-0098353bb996","Type":"ContainerDied","Data":"754503ce4d1e48bc6b908f995b1ac770dceebc9d3b66fe13ed74f2d7a883ad1f"}
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.058738 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="754503ce4d1e48bc6b908f995b1ac770dceebc9d3b66fe13ed74f2d7a883ad1f"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.058542 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-zb4gw"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.089625 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" podStartSLOduration=3.089605929 podStartE2EDuration="3.089605929s" podCreationTimestamp="2026-03-18 18:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:08.081639897 +0000 UTC m=+1042.649270239" watchObservedRunningTime="2026-03-18 18:20:08.089605929 +0000 UTC m=+1042.657236271"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.215345 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:08 crc kubenswrapper[4830]: E0318 18:20:08.215602 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 18:20:08 crc kubenswrapper[4830]: E0318 18:20:08.215646 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 18:20:08 crc kubenswrapper[4830]: E0318 18:20:08.215728 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift podName:ccc6cbaa-b562-49fc-9add-94aac04d60ed nodeName:}" failed. No retries permitted until 2026-03-18 18:20:10.215703224 +0000 UTC m=+1044.783333596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift") pod "swift-storage-0" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed") : configmap "swift-ring-files" not found
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.432599 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4l8qp"]
Mar 18 18:20:08 crc kubenswrapper[4830]: E0318 18:20:08.432939 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5d6885-918e-49f2-8fdf-0098353bb996" containerName="mariadb-account-create-update"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.432985 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5d6885-918e-49f2-8fdf-0098353bb996" containerName="mariadb-account-create-update"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.433171 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5d6885-918e-49f2-8fdf-0098353bb996" containerName="mariadb-account-create-update"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.433658 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4l8qp"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.446986 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4l8qp"]
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.463117 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-19b0-account-create-update-tm69m"]
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.464203 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-19b0-account-create-update-tm69m"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.470576 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.511811 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-19b0-account-create-update-tm69m"]
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.521859 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6lmc\" (UniqueName: \"kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.522035 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m"
Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.522098 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.522208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd99n\" (UniqueName: \"kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.624217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd99n\" (UniqueName: \"kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.624582 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6lmc\" (UniqueName: \"kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.624736 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.624849 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.625512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.625557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.642146 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6lmc\" (UniqueName: \"kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc\") pod \"glance-db-create-4l8qp\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.651306 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd99n\" (UniqueName: \"kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n\") pod \"glance-19b0-account-create-update-tm69m\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.750039 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:08 crc kubenswrapper[4830]: I0318 18:20:08.797155 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:09 crc kubenswrapper[4830]: I0318 18:20:09.059966 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" Mar 18 18:20:09 crc kubenswrapper[4830]: I0318 18:20:09.226400 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4l8qp"] Mar 18 18:20:09 crc kubenswrapper[4830]: I0318 18:20:09.391837 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-19b0-account-create-update-tm69m"] Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.029827 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-f24hk"] Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.031064 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.034490 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.071062 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-f24hk"] Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.075273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4l8qp" event={"ID":"7d328c0f-c9ac-4381-884a-44182b2544d7","Type":"ContainerStarted","Data":"c8b6039ff82377e3cbdbc870e80a62c67e761a89afd7485c46655f72832fbaf9"} Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.155352 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.155424 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84t98\" (UniqueName: \"kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.260863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc 
kubenswrapper[4830]: I0318 18:20:10.260974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.261017 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84t98\" (UniqueName: \"kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: E0318 18:20:10.261498 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:20:10 crc kubenswrapper[4830]: E0318 18:20:10.261531 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:20:10 crc kubenswrapper[4830]: E0318 18:20:10.261593 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift podName:ccc6cbaa-b562-49fc-9add-94aac04d60ed nodeName:}" failed. No retries permitted until 2026-03-18 18:20:14.261571311 +0000 UTC m=+1048.829201673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift") pod "swift-storage-0" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed") : configmap "swift-ring-files" not found Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.262399 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.287595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84t98\" (UniqueName: \"kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98\") pod \"root-account-create-update-f24hk\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.368679 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:10 crc kubenswrapper[4830]: I0318 18:20:10.726175 4830 scope.go:117] "RemoveContainer" containerID="1c1b4c219495822da312818182ba9a7042f0be47364fe23f93d8553abdd3d518" Mar 18 18:20:11 crc kubenswrapper[4830]: I0318 18:20:11.083223 4830 generic.go:334] "Generic (PLEG): container finished" podID="7d328c0f-c9ac-4381-884a-44182b2544d7" containerID="c8b481dbe3098add8fb7e6339fdbf0bcaecb99b50c86f76e144ae1ccaa2b3c6f" exitCode=0 Mar 18 18:20:11 crc kubenswrapper[4830]: I0318 18:20:11.083266 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4l8qp" event={"ID":"7d328c0f-c9ac-4381-884a-44182b2544d7","Type":"ContainerDied","Data":"c8b481dbe3098add8fb7e6339fdbf0bcaecb99b50c86f76e144ae1ccaa2b3c6f"} Mar 18 18:20:11 crc kubenswrapper[4830]: W0318 18:20:11.374468 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5c80123_0588_4b50_a44b_18dca565e2ed.slice/crio-725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6 WatchSource:0}: Error finding container 725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6: Status 404 returned error can't find the container with id 725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6 Mar 18 18:20:11 crc kubenswrapper[4830]: I0318 18:20:11.905270 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-f24hk"] Mar 18 18:20:11 crc kubenswrapper[4830]: W0318 18:20:11.914111 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ffce816_aaf7_430e_98c7_df9f85c17e0d.slice/crio-2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db WatchSource:0}: Error finding container 2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db: Status 404 returned error can't find the container with id 
2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.097701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f24hk" event={"ID":"8ffce816-aaf7-430e-98c7-df9f85c17e0d","Type":"ContainerStarted","Data":"2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db"} Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.102214 4830 generic.go:334] "Generic (PLEG): container finished" podID="c5c80123-0588-4b50-a44b-18dca565e2ed" containerID="a4338a1a65a166268b1973a86ac23dc739a16a3c7ea8b2c8be7a7cf3c241ecee" exitCode=0 Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.102329 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-19b0-account-create-update-tm69m" event={"ID":"c5c80123-0588-4b50-a44b-18dca565e2ed","Type":"ContainerDied","Data":"a4338a1a65a166268b1973a86ac23dc739a16a3c7ea8b2c8be7a7cf3c241ecee"} Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.102399 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-19b0-account-create-update-tm69m" event={"ID":"c5c80123-0588-4b50-a44b-18dca565e2ed","Type":"ContainerStarted","Data":"725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6"} Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.106415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmp7q" event={"ID":"cb262f8b-f0ed-4644-b313-2a2b46815860","Type":"ContainerStarted","Data":"ca885e950a7893618761320e7bff3491a6c83307e3faaddbb8ddf40f9d55f77e"} Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.148081 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nmp7q" podStartSLOduration=2.3200179739999998 podStartE2EDuration="6.148055374s" podCreationTimestamp="2026-03-18 18:20:06 +0000 UTC" firstStartedPulling="2026-03-18 18:20:07.657065192 +0000 UTC m=+1042.224695534" 
lastFinishedPulling="2026-03-18 18:20:11.485102591 +0000 UTC m=+1046.052732934" observedRunningTime="2026-03-18 18:20:12.139388353 +0000 UTC m=+1046.707018695" watchObservedRunningTime="2026-03-18 18:20:12.148055374 +0000 UTC m=+1046.715685716" Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.501591 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.700994 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts\") pod \"7d328c0f-c9ac-4381-884a-44182b2544d7\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.701127 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6lmc\" (UniqueName: \"kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc\") pod \"7d328c0f-c9ac-4381-884a-44182b2544d7\" (UID: \"7d328c0f-c9ac-4381-884a-44182b2544d7\") " Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.702029 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d328c0f-c9ac-4381-884a-44182b2544d7" (UID: "7d328c0f-c9ac-4381-884a-44182b2544d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.708120 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc" (OuterVolumeSpecName: "kube-api-access-f6lmc") pod "7d328c0f-c9ac-4381-884a-44182b2544d7" (UID: "7d328c0f-c9ac-4381-884a-44182b2544d7"). 
InnerVolumeSpecName "kube-api-access-f6lmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.802685 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6lmc\" (UniqueName: \"kubernetes.io/projected/7d328c0f-c9ac-4381-884a-44182b2544d7-kube-api-access-f6lmc\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:12 crc kubenswrapper[4830]: I0318 18:20:12.802716 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d328c0f-c9ac-4381-884a-44182b2544d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.119858 4830 generic.go:334] "Generic (PLEG): container finished" podID="8ffce816-aaf7-430e-98c7-df9f85c17e0d" containerID="1ccee24cadf8c0c5bd3c5bc1d38965eb6e50e70aa0322e3306f65f0b4f8a4891" exitCode=0 Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.119974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f24hk" event={"ID":"8ffce816-aaf7-430e-98c7-df9f85c17e0d","Type":"ContainerDied","Data":"1ccee24cadf8c0c5bd3c5bc1d38965eb6e50e70aa0322e3306f65f0b4f8a4891"} Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.123114 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4l8qp" event={"ID":"7d328c0f-c9ac-4381-884a-44182b2544d7","Type":"ContainerDied","Data":"c8b6039ff82377e3cbdbc870e80a62c67e761a89afd7485c46655f72832fbaf9"} Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.123167 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4l8qp" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.123177 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b6039ff82377e3cbdbc870e80a62c67e761a89afd7485c46655f72832fbaf9" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.519656 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.624965 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd99n\" (UniqueName: \"kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n\") pod \"c5c80123-0588-4b50-a44b-18dca565e2ed\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.625105 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts\") pod \"c5c80123-0588-4b50-a44b-18dca565e2ed\" (UID: \"c5c80123-0588-4b50-a44b-18dca565e2ed\") " Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.626484 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5c80123-0588-4b50-a44b-18dca565e2ed" (UID: "c5c80123-0588-4b50-a44b-18dca565e2ed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.640819 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n" (OuterVolumeSpecName: "kube-api-access-qd99n") pod "c5c80123-0588-4b50-a44b-18dca565e2ed" (UID: "c5c80123-0588-4b50-a44b-18dca565e2ed"). InnerVolumeSpecName "kube-api-access-qd99n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.727261 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd99n\" (UniqueName: \"kubernetes.io/projected/c5c80123-0588-4b50-a44b-18dca565e2ed-kube-api-access-qd99n\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:13 crc kubenswrapper[4830]: I0318 18:20:13.727308 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c80123-0588-4b50-a44b-18dca565e2ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.039134 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-btc59"] Mar 18 18:20:14 crc kubenswrapper[4830]: E0318 18:20:14.039468 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d328c0f-c9ac-4381-884a-44182b2544d7" containerName="mariadb-database-create" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.039479 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d328c0f-c9ac-4381-884a-44182b2544d7" containerName="mariadb-database-create" Mar 18 18:20:14 crc kubenswrapper[4830]: E0318 18:20:14.039490 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c80123-0588-4b50-a44b-18dca565e2ed" containerName="mariadb-account-create-update" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.039496 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5c80123-0588-4b50-a44b-18dca565e2ed" containerName="mariadb-account-create-update" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.039647 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c80123-0588-4b50-a44b-18dca565e2ed" containerName="mariadb-account-create-update" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.039673 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d328c0f-c9ac-4381-884a-44182b2544d7" containerName="mariadb-database-create" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.040163 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.051217 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-btc59"] Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.129804 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-df3e-account-create-update-vd9pb"] Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.134354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-19b0-account-create-update-tm69m" event={"ID":"c5c80123-0588-4b50-a44b-18dca565e2ed","Type":"ContainerDied","Data":"725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6"} Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.134419 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="725be4dfd417ce5abd093e0204fa2888b2f6d43aeb80893dc0863ac8366db7d6" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.134664 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-19b0-account-create-update-tm69m" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.135663 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.135809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pp7f\" (UniqueName: \"kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.135931 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.140292 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df3e-account-create-update-vd9pb"] Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.141270 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.256306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.256546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pp7f\" (UniqueName: \"kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.257618 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.257819 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gp2m\" (UniqueName: \"kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: 
\"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.257346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.271496 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-49qmf"] Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.276844 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.276859 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-49qmf"] Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.301632 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pp7f\" (UniqueName: \"kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f\") pod \"keystone-db-create-btc59\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.359580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.359623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.359649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrl7\" (UniqueName: \"kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.359680 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gp2m\" (UniqueName: \"kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.359703 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: E0318 18:20:14.361465 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:20:14 crc kubenswrapper[4830]: E0318 18:20:14.361491 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:20:14 crc kubenswrapper[4830]: E0318 18:20:14.361539 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift podName:ccc6cbaa-b562-49fc-9add-94aac04d60ed nodeName:}" failed. No retries permitted until 2026-03-18 18:20:22.361521701 +0000 UTC m=+1056.929152143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift") pod "swift-storage-0" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed") : configmap "swift-ring-files" not found Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.361684 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.383331 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gp2m\" (UniqueName: \"kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m\") pod \"keystone-df3e-account-create-update-vd9pb\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.388024 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-btc59" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.461020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrl7\" (UniqueName: \"kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.461077 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.461841 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.473506 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.477897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrl7\" (UniqueName: \"kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7\") pod \"placement-db-create-49qmf\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.578481 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-49qmf" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.588880 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.665102 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84t98\" (UniqueName: \"kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98\") pod \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.665188 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts\") pod \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\" (UID: \"8ffce816-aaf7-430e-98c7-df9f85c17e0d\") " Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.666095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ffce816-aaf7-430e-98c7-df9f85c17e0d" (UID: "8ffce816-aaf7-430e-98c7-df9f85c17e0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.669822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98" (OuterVolumeSpecName: "kube-api-access-84t98") pod "8ffce816-aaf7-430e-98c7-df9f85c17e0d" (UID: "8ffce816-aaf7-430e-98c7-df9f85c17e0d"). InnerVolumeSpecName "kube-api-access-84t98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.767997 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84t98\" (UniqueName: \"kubernetes.io/projected/8ffce816-aaf7-430e-98c7-df9f85c17e0d-kube-api-access-84t98\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.768348 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ffce816-aaf7-430e-98c7-df9f85c17e0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.831553 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-btc59"] Mar 18 18:20:14 crc kubenswrapper[4830]: W0318 18:20:14.837668 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb738a352_32c7_4373_8324_8a02c359d300.slice/crio-880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a WatchSource:0}: Error finding container 880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a: Status 404 returned error can't find the container with id 880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a Mar 18 18:20:14 crc kubenswrapper[4830]: I0318 18:20:14.973107 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df3e-account-create-update-vd9pb"] Mar 18 18:20:14 crc kubenswrapper[4830]: W0318 18:20:14.981395 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee2185d9_90ed_4fc9_a38b_eab30e813652.slice/crio-4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38 WatchSource:0}: Error finding container 4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38: Status 404 returned error can't find the container with id 
4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38 Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.062117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-49qmf"] Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.153966 4830 generic.go:334] "Generic (PLEG): container finished" podID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerID="15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333" exitCode=0 Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.154053 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerDied","Data":"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.164079 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f24hk" event={"ID":"8ffce816-aaf7-430e-98c7-df9f85c17e0d","Type":"ContainerDied","Data":"2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.164444 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a80516310af76bf49be72873744cf64b8d9cec11ab3e718072e807e4d50b4db" Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.164352 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-f24hk" Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.171540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-btc59" event={"ID":"b738a352-32c7-4373-8324-8a02c359d300","Type":"ContainerStarted","Data":"964d29d1c66eefd1d61b425bffee997a70c6c81bf69aabae1c5c8383843cd69b"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.171584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-btc59" event={"ID":"b738a352-32c7-4373-8324-8a02c359d300","Type":"ContainerStarted","Data":"880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.173786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49qmf" event={"ID":"9ae038dc-03d7-407c-81eb-ae1bae65d555","Type":"ContainerStarted","Data":"b6bea9636b459cecd9bf0538a742d7a38c7b3dd3c92ee7a82d5b683847fe49c5"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.176191 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df3e-account-create-update-vd9pb" event={"ID":"ee2185d9-90ed-4fc9-a38b-eab30e813652","Type":"ContainerStarted","Data":"4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38"} Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.200257 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-btc59" podStartSLOduration=1.200235282 podStartE2EDuration="1.200235282s" podCreationTimestamp="2026-03-18 18:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:15.19658761 +0000 UTC m=+1049.764217932" watchObservedRunningTime="2026-03-18 18:20:15.200235282 +0000 UTC m=+1049.767865624" Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.222498 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-df3e-account-create-update-vd9pb" podStartSLOduration=1.222393018 podStartE2EDuration="1.222393018s" podCreationTimestamp="2026-03-18 18:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:15.212242016 +0000 UTC m=+1049.779872378" watchObservedRunningTime="2026-03-18 18:20:15.222393018 +0000 UTC m=+1049.790023390" Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.793051 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.873148 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"] Mar 18 18:20:15 crc kubenswrapper[4830]: I0318 18:20:15.873398 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="dnsmasq-dns" containerID="cri-o://6b0a1000ef0b2d9f58a639bef40f03b22a1d61614e2c9ca09ec4a411f863e252" gracePeriod=10 Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.191261 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerStarted","Data":"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d"} Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.191853 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.204953 4830 generic.go:334] "Generic (PLEG): container finished" podID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerID="6b0a1000ef0b2d9f58a639bef40f03b22a1d61614e2c9ca09ec4a411f863e252" exitCode=0 Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 
18:20:16.204996 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" event={"ID":"3a9f706f-6103-4f0e-bf89-0389d47ef9ed","Type":"ContainerDied","Data":"6b0a1000ef0b2d9f58a639bef40f03b22a1d61614e2c9ca09ec4a411f863e252"} Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.208667 4830 generic.go:334] "Generic (PLEG): container finished" podID="b738a352-32c7-4373-8324-8a02c359d300" containerID="964d29d1c66eefd1d61b425bffee997a70c6c81bf69aabae1c5c8383843cd69b" exitCode=0 Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.208957 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-btc59" event={"ID":"b738a352-32c7-4373-8324-8a02c359d300","Type":"ContainerDied","Data":"964d29d1c66eefd1d61b425bffee997a70c6c81bf69aabae1c5c8383843cd69b"} Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.213176 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ae038dc-03d7-407c-81eb-ae1bae65d555" containerID="e9e760a71aaf066d2755a2ade0ff78d11fa557bca8fdd20885df296496a7747d" exitCode=0 Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.213330 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49qmf" event={"ID":"9ae038dc-03d7-407c-81eb-ae1bae65d555","Type":"ContainerDied","Data":"e9e760a71aaf066d2755a2ade0ff78d11fa557bca8fdd20885df296496a7747d"} Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.217020 4830 generic.go:334] "Generic (PLEG): container finished" podID="ee2185d9-90ed-4fc9-a38b-eab30e813652" containerID="6147c63258a2165648c12a669fffd532aabced926b601af8d7cde628ab4b44a4" exitCode=0 Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.217060 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df3e-account-create-update-vd9pb" event={"ID":"ee2185d9-90ed-4fc9-a38b-eab30e813652","Type":"ContainerDied","Data":"6147c63258a2165648c12a669fffd532aabced926b601af8d7cde628ab4b44a4"} Mar 18 18:20:16 crc 
kubenswrapper[4830]: I0318 18:20:16.221584 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.041277781 podStartE2EDuration="57.221566149s" podCreationTimestamp="2026-03-18 18:19:19 +0000 UTC" firstStartedPulling="2026-03-18 18:19:20.955728208 +0000 UTC m=+995.523358540" lastFinishedPulling="2026-03-18 18:19:41.136016576 +0000 UTC m=+1015.703646908" observedRunningTime="2026-03-18 18:20:16.215579763 +0000 UTC m=+1050.783210105" watchObservedRunningTime="2026-03-18 18:20:16.221566149 +0000 UTC m=+1050.789196481" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.299114 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-f24hk"] Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.304688 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-f24hk"] Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.388708 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.491438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config\") pod \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.491501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb\") pod \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.491530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb\") pod \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.491590 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc\") pod \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.491648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whg87\" (UniqueName: \"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87\") pod \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\" (UID: \"3a9f706f-6103-4f0e-bf89-0389d47ef9ed\") " Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.498201 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87" (OuterVolumeSpecName: "kube-api-access-whg87") pod "3a9f706f-6103-4f0e-bf89-0389d47ef9ed" (UID: "3a9f706f-6103-4f0e-bf89-0389d47ef9ed"). InnerVolumeSpecName "kube-api-access-whg87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.530165 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a9f706f-6103-4f0e-bf89-0389d47ef9ed" (UID: "3a9f706f-6103-4f0e-bf89-0389d47ef9ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.535630 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config" (OuterVolumeSpecName: "config") pod "3a9f706f-6103-4f0e-bf89-0389d47ef9ed" (UID: "3a9f706f-6103-4f0e-bf89-0389d47ef9ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.543603 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a9f706f-6103-4f0e-bf89-0389d47ef9ed" (UID: "3a9f706f-6103-4f0e-bf89-0389d47ef9ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.545512 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a9f706f-6103-4f0e-bf89-0389d47ef9ed" (UID: "3a9f706f-6103-4f0e-bf89-0389d47ef9ed"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.593816 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.593850 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.593860 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.593871 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whg87\" (UniqueName: \"kubernetes.io/projected/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-kube-api-access-whg87\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:16 crc kubenswrapper[4830]: I0318 18:20:16.593885 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9f706f-6103-4f0e-bf89-0389d47ef9ed-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.229182 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.229253 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-4zxns" event={"ID":"3a9f706f-6103-4f0e-bf89-0389d47ef9ed","Type":"ContainerDied","Data":"4ddcb02676be975041680d561dab6178bae6e3ccac9cfcd37497a1bf0db1dbba"} Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.229422 4830 scope.go:117] "RemoveContainer" containerID="6b0a1000ef0b2d9f58a639bef40f03b22a1d61614e2c9ca09ec4a411f863e252" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.283923 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"] Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.287605 4830 scope.go:117] "RemoveContainer" containerID="02cf15d2b959ead835b6e8299f4c0e47ada8a43833e6352b80f8cccd3d2f01d4" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.296205 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-4zxns"] Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.705428 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-btc59" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.729490 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts\") pod \"b738a352-32c7-4373-8324-8a02c359d300\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.729576 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pp7f\" (UniqueName: \"kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f\") pod \"b738a352-32c7-4373-8324-8a02c359d300\" (UID: \"b738a352-32c7-4373-8324-8a02c359d300\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.731909 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b738a352-32c7-4373-8324-8a02c359d300" (UID: "b738a352-32c7-4373-8324-8a02c359d300"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.747599 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f" (OuterVolumeSpecName: "kube-api-access-9pp7f") pod "b738a352-32c7-4373-8324-8a02c359d300" (UID: "b738a352-32c7-4373-8324-8a02c359d300"). InnerVolumeSpecName "kube-api-access-9pp7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.831097 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b738a352-32c7-4373-8324-8a02c359d300-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.831127 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pp7f\" (UniqueName: \"kubernetes.io/projected/b738a352-32c7-4373-8324-8a02c359d300-kube-api-access-9pp7f\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.831177 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49qmf" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.837364 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-vd9pb" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932003 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts\") pod \"9ae038dc-03d7-407c-81eb-ae1bae65d555\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932067 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gp2m\" (UniqueName: \"kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m\") pod \"ee2185d9-90ed-4fc9-a38b-eab30e813652\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932120 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts\") pod 
\"ee2185d9-90ed-4fc9-a38b-eab30e813652\" (UID: \"ee2185d9-90ed-4fc9-a38b-eab30e813652\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932259 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvrl7\" (UniqueName: \"kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7\") pod \"9ae038dc-03d7-407c-81eb-ae1bae65d555\" (UID: \"9ae038dc-03d7-407c-81eb-ae1bae65d555\") " Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932446 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ae038dc-03d7-407c-81eb-ae1bae65d555" (UID: "9ae038dc-03d7-407c-81eb-ae1bae65d555"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932620 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae038dc-03d7-407c-81eb-ae1bae65d555-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.932667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee2185d9-90ed-4fc9-a38b-eab30e813652" (UID: "ee2185d9-90ed-4fc9-a38b-eab30e813652"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.935116 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m" (OuterVolumeSpecName: "kube-api-access-8gp2m") pod "ee2185d9-90ed-4fc9-a38b-eab30e813652" (UID: "ee2185d9-90ed-4fc9-a38b-eab30e813652"). 
InnerVolumeSpecName "kube-api-access-8gp2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:17 crc kubenswrapper[4830]: I0318 18:20:17.935742 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7" (OuterVolumeSpecName: "kube-api-access-pvrl7") pod "9ae038dc-03d7-407c-81eb-ae1bae65d555" (UID: "9ae038dc-03d7-407c-81eb-ae1bae65d555"). InnerVolumeSpecName "kube-api-access-pvrl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.034424 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gp2m\" (UniqueName: \"kubernetes.io/projected/ee2185d9-90ed-4fc9-a38b-eab30e813652-kube-api-access-8gp2m\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.034481 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee2185d9-90ed-4fc9-a38b-eab30e813652-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.034500 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvrl7\" (UniqueName: \"kubernetes.io/projected/9ae038dc-03d7-407c-81eb-ae1bae65d555-kube-api-access-pvrl7\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.256584 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" path="/var/lib/kubelet/pods/3a9f706f-6103-4f0e-bf89-0389d47ef9ed/volumes" Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.257079 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-btc59"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.257213 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ffce816-aaf7-430e-98c7-df9f85c17e0d" path="/var/lib/kubelet/pods/8ffce816-aaf7-430e-98c7-df9f85c17e0d/volumes"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.258037 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-btc59" event={"ID":"b738a352-32c7-4373-8324-8a02c359d300","Type":"ContainerDied","Data":"880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a"}
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.258060 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880350f92aa5df10b3e31d7062eef49b0211eee82d59b720b0dc63fbc8c2820a"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.269856 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49qmf" event={"ID":"9ae038dc-03d7-407c-81eb-ae1bae65d555","Type":"ContainerDied","Data":"b6bea9636b459cecd9bf0538a742d7a38c7b3dd3c92ee7a82d5b683847fe49c5"}
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.269903 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6bea9636b459cecd9bf0538a742d7a38c7b3dd3c92ee7a82d5b683847fe49c5"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.269975 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49qmf"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.272891 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df3e-account-create-update-vd9pb" event={"ID":"ee2185d9-90ed-4fc9-a38b-eab30e813652","Type":"ContainerDied","Data":"4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38"}
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.272922 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d92a121b10e8001205f95e42fa485b95b069671e86dd3403fd52861d57f0a38"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.272978 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-vd9pb"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.611762 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hxlrh"]
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612374 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="init"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612385 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="init"
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612397 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffce816-aaf7-430e-98c7-df9f85c17e0d" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612403 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffce816-aaf7-430e-98c7-df9f85c17e0d" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612413 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2185d9-90ed-4fc9-a38b-eab30e813652" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612419 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2185d9-90ed-4fc9-a38b-eab30e813652" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612430 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b738a352-32c7-4373-8324-8a02c359d300" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612436 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b738a352-32c7-4373-8324-8a02c359d300" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612447 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="dnsmasq-dns"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612452 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="dnsmasq-dns"
Mar 18 18:20:18 crc kubenswrapper[4830]: E0318 18:20:18.612463 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae038dc-03d7-407c-81eb-ae1bae65d555" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612469 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae038dc-03d7-407c-81eb-ae1bae65d555" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612621 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2185d9-90ed-4fc9-a38b-eab30e813652" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612635 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffce816-aaf7-430e-98c7-df9f85c17e0d" containerName="mariadb-account-create-update"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612647 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b738a352-32c7-4373-8324-8a02c359d300" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612654 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae038dc-03d7-407c-81eb-ae1bae65d555" containerName="mariadb-database-create"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.612661 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9f706f-6103-4f0e-bf89-0389d47ef9ed" containerName="dnsmasq-dns"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.613138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.618115 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.618284 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-x4gbn"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.634390 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hxlrh"]
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.649758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.649830 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.649887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckns\" (UniqueName: \"kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.649916 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.751300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.751656 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.751896 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ckns\" (UniqueName: \"kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.752084 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.758057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.758573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.758884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.777182 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckns\" (UniqueName: \"kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns\") pod \"glance-db-sync-hxlrh\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:18 crc kubenswrapper[4830]: I0318 18:20:18.940092 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hxlrh"
Mar 18 18:20:19 crc kubenswrapper[4830]: I0318 18:20:19.290837 4830 generic.go:334] "Generic (PLEG): container finished" podID="cb262f8b-f0ed-4644-b313-2a2b46815860" containerID="ca885e950a7893618761320e7bff3491a6c83307e3faaddbb8ddf40f9d55f77e" exitCode=0
Mar 18 18:20:19 crc kubenswrapper[4830]: I0318 18:20:19.290905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmp7q" event={"ID":"cb262f8b-f0ed-4644-b313-2a2b46815860","Type":"ContainerDied","Data":"ca885e950a7893618761320e7bff3491a6c83307e3faaddbb8ddf40f9d55f77e"}
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:19.440295 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 18 18:20:20 crc kubenswrapper[4830]: W0318 18:20:19.529813 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cad194_a0a7_44e7_8e5c_4653ae33983c.slice/crio-b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5 WatchSource:0}: Error finding container b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5: Status 404 returned error can't find the container with id b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:19.535809 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hxlrh"]
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.073809 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dnbxp"]
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.075794 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.078263 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.085744 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnbxp"]
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.180268 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fscsh\" (UniqueName: \"kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.180493 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.282490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.282576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscsh\" (UniqueName: \"kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.284258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.300655 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hxlrh" event={"ID":"e2cad194-a0a7-44e7-8e5c-4653ae33983c","Type":"ContainerStarted","Data":"b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5"}
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.345800 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fscsh\" (UniqueName: \"kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh\") pod \"root-account-create-update-dnbxp\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") " pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.405715 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.641138 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692391 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692522 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692576 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692646 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljb9m\" (UniqueName: \"kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692673 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692736 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.692818 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift\") pod \"cb262f8b-f0ed-4644-b313-2a2b46815860\" (UID: \"cb262f8b-f0ed-4644-b313-2a2b46815860\") "
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.693672 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.693813 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.722063 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m" (OuterVolumeSpecName: "kube-api-access-ljb9m") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "kube-api-access-ljb9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.727167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.727259 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.735396 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnbxp"]
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.738157 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts" (OuterVolumeSpecName: "scripts") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: W0318 18:20:20.742670 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d548cc7_98a5_4291_a212_1a61ec2ed8bb.slice/crio-c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9 WatchSource:0}: Error finding container c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9: Status 404 returned error can't find the container with id c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.746427 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb262f8b-f0ed-4644-b313-2a2b46815860" (UID: "cb262f8b-f0ed-4644-b313-2a2b46815860"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.794930 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795023 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb262f8b-f0ed-4644-b313-2a2b46815860-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795076 4830 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb262f8b-f0ed-4644-b313-2a2b46815860-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795123 4830 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795166 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795222 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljb9m\" (UniqueName: \"kubernetes.io/projected/cb262f8b-f0ed-4644-b313-2a2b46815860-kube-api-access-ljb9m\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:20 crc kubenswrapper[4830]: I0318 18:20:20.795269 4830 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb262f8b-f0ed-4644-b313-2a2b46815860-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.311394 4830 generic.go:334] "Generic (PLEG): container finished" podID="6d548cc7-98a5-4291-a212-1a61ec2ed8bb" containerID="a00384955c734a22389aeb43e6ded62b5afecb8b1d30da824d579f5a71c2a71a" exitCode=0
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.311501 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnbxp" event={"ID":"6d548cc7-98a5-4291-a212-1a61ec2ed8bb","Type":"ContainerDied","Data":"a00384955c734a22389aeb43e6ded62b5afecb8b1d30da824d579f5a71c2a71a"}
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.311813 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnbxp" event={"ID":"6d548cc7-98a5-4291-a212-1a61ec2ed8bb","Type":"ContainerStarted","Data":"c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9"}
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.315511 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmp7q" event={"ID":"cb262f8b-f0ed-4644-b313-2a2b46815860","Type":"ContainerDied","Data":"ef7cb5dcf01c3803669c426fc1213faf11fd090df14fffeb0df9b3a3c040c170"}
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.315564 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef7cb5dcf01c3803669c426fc1213faf11fd090df14fffeb0df9b3a3c040c170"
Mar 18 18:20:21 crc kubenswrapper[4830]: I0318 18:20:21.315578 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nmp7q"
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.426139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.451181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"swift-storage-0\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " pod="openstack/swift-storage-0"
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.470341 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.862362 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.934634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts\") pod \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") "
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.934713 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fscsh\" (UniqueName: \"kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh\") pod \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\" (UID: \"6d548cc7-98a5-4291-a212-1a61ec2ed8bb\") "
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.935792 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d548cc7-98a5-4291-a212-1a61ec2ed8bb" (UID: "6d548cc7-98a5-4291-a212-1a61ec2ed8bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:20:22 crc kubenswrapper[4830]: I0318 18:20:22.939367 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh" (OuterVolumeSpecName: "kube-api-access-fscsh") pod "6d548cc7-98a5-4291-a212-1a61ec2ed8bb" (UID: "6d548cc7-98a5-4291-a212-1a61ec2ed8bb"). InnerVolumeSpecName "kube-api-access-fscsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.013028 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 18:20:23 crc kubenswrapper[4830]: W0318 18:20:23.026001 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccc6cbaa_b562_49fc_9add_94aac04d60ed.slice/crio-b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655 WatchSource:0}: Error finding container b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655: Status 404 returned error can't find the container with id b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.037530 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.037574 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fscsh\" (UniqueName: \"kubernetes.io/projected/6d548cc7-98a5-4291-a212-1a61ec2ed8bb-kube-api-access-fscsh\") on node \"crc\" DevicePath \"\""
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.346130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnbxp" event={"ID":"6d548cc7-98a5-4291-a212-1a61ec2ed8bb","Type":"ContainerDied","Data":"c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9"}
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.346170 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0dec381c9f702824cccc956935d0df953dd2b92df5c3c57c7b1d178f67134a9"
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.346251 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnbxp"
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.347637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655"}
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.532540 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-chwf9" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 18:20:23 crc kubenswrapper[4830]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 18:20:23 crc kubenswrapper[4830]: >
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.824610 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dv8kn"
Mar 18 18:20:23 crc kubenswrapper[4830]: I0318 18:20:23.829126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dv8kn"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.076629 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-chwf9-config-b6tzr"]
Mar 18 18:20:24 crc kubenswrapper[4830]: E0318 18:20:24.077032 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d548cc7-98a5-4291-a212-1a61ec2ed8bb" containerName="mariadb-account-create-update"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.077048 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d548cc7-98a5-4291-a212-1a61ec2ed8bb" containerName="mariadb-account-create-update"
Mar 18 18:20:24 crc kubenswrapper[4830]: E0318 18:20:24.077062 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb262f8b-f0ed-4644-b313-2a2b46815860" containerName="swift-ring-rebalance"
Mar 18 crc kubenswrapper[4830]: I0318 18:20:24.077069 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb262f8b-f0ed-4644-b313-2a2b46815860" containerName="swift-ring-rebalance"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.077252 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb262f8b-f0ed-4644-b313-2a2b46815860" containerName="swift-ring-rebalance"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.077269 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d548cc7-98a5-4291-a212-1a61ec2ed8bb" containerName="mariadb-account-create-update"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.077856 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.082587 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.092534 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9-config-b6tzr"]
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.155289 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.155385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 crc kubenswrapper[4830]: I0318 18:20:24.155439 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.155676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.155846 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gffjn\" (UniqueName: \"kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.155921 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257336 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.257363 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gffjn\" (UniqueName: \"kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.260219 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.261417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.261764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.261854 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.261934 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr"
Mar 18 crc
kubenswrapper[4830]: I0318 18:20:24.280097 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gffjn\" (UniqueName: \"kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn\") pod \"ovn-controller-chwf9-config-b6tzr\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " pod="openstack/ovn-controller-chwf9-config-b6tzr" Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.407337 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9-config-b6tzr" Mar 18 18:20:24 crc kubenswrapper[4830]: I0318 18:20:24.863047 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9-config-b6tzr"] Mar 18 18:20:25 crc kubenswrapper[4830]: W0318 18:20:25.127535 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb167c2e_263e_4566_bfa2_0c5ccbdb65b0.slice/crio-920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e WatchSource:0}: Error finding container 920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e: Status 404 returned error can't find the container with id 920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e Mar 18 18:20:25 crc kubenswrapper[4830]: I0318 18:20:25.363967 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-b6tzr" event={"ID":"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0","Type":"ContainerStarted","Data":"920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e"} Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.335580 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dnbxp"] Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.342002 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dnbxp"] Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 
18:20:26.372511 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"6c5bd47f9683cd9c5e03e6fd6c51407a8085923fe0e9f8ac3506a2c980271f44"} Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.372579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"933487d6b7c0d60ba81cf11b01ceaae63489030bbb5fd50148a67d7724abf942"} Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.372598 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"7e87a03e3adb66017525596597b8739a2dd883902ed90c632e4e5cbfbfade6fe"} Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.373780 4830 generic.go:334] "Generic (PLEG): container finished" podID="eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" containerID="d154a60d139fbff2ee36c61dbc4db23537b10afebf0dabcddf0ce4e5874a933a" exitCode=0 Mar 18 18:20:26 crc kubenswrapper[4830]: I0318 18:20:26.373830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-b6tzr" event={"ID":"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0","Type":"ContainerDied","Data":"d154a60d139fbff2ee36c61dbc4db23537b10afebf0dabcddf0ce4e5874a933a"} Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.389205 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"1176cdff085a64af57931e22a9423ae76c0f52837134d47b63aa9518c32e92c6"} Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.802280 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-chwf9-config-b6tzr" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923382 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923708 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923760 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923867 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923887 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.923902 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gffjn\" (UniqueName: 
\"kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn\") pod \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\" (UID: \"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0\") " Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924164 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run" (OuterVolumeSpecName: "var-run") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924468 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924820 4830 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924843 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924855 4830 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924875 4830 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.924983 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts" (OuterVolumeSpecName: "scripts") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:27 crc kubenswrapper[4830]: I0318 18:20:27.931893 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn" (OuterVolumeSpecName: "kube-api-access-gffjn") pod "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" (UID: "eb167c2e-263e-4566-bfa2-0c5ccbdb65b0"). InnerVolumeSpecName "kube-api-access-gffjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.027104 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gffjn\" (UniqueName: \"kubernetes.io/projected/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-kube-api-access-gffjn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.027154 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.251900 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d548cc7-98a5-4291-a212-1a61ec2ed8bb" path="/var/lib/kubelet/pods/6d548cc7-98a5-4291-a212-1a61ec2ed8bb/volumes" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.409465 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-b6tzr" event={"ID":"eb167c2e-263e-4566-bfa2-0c5ccbdb65b0","Type":"ContainerDied","Data":"920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e"} Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.409525 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920139b5f496df419436c9f51436e10011200f4e81a0cc9fe5ce7a45185c713e" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.409709 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-chwf9-config-b6tzr" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.544400 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-chwf9" Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.931175 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-chwf9-config-b6tzr"] Mar 18 18:20:28 crc kubenswrapper[4830]: I0318 18:20:28.940585 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-chwf9-config-b6tzr"] Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.035787 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-chwf9-config-7qp5d"] Mar 18 18:20:29 crc kubenswrapper[4830]: E0318 18:20:29.036278 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" containerName="ovn-config" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.036299 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" containerName="ovn-config" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.036595 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" containerName="ovn-config" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.037390 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.046610 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9-config-7qp5d"] Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.060389 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148024 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6cnn\" (UniqueName: \"kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148162 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: 
\"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.148335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.252857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.252925 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253009 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6cnn\" (UniqueName: \"kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn\") pod 
\"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253035 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: 
\"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.253358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.254175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.255115 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.283331 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6cnn\" (UniqueName: \"kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn\") pod \"ovn-controller-chwf9-config-7qp5d\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.374196 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.420346 4830 generic.go:334] "Generic (PLEG): container finished" podID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerID="4b3823ab703387f205ec3b36349fb621c98a8c89a6e4303228224586840c10d9" exitCode=0 Mar 18 18:20:29 crc kubenswrapper[4830]: I0318 18:20:29.420385 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerDied","Data":"4b3823ab703387f205ec3b36349fb621c98a8c89a6e4303228224586840c10d9"} Mar 18 18:20:30 crc kubenswrapper[4830]: I0318 18:20:30.251752 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb167c2e-263e-4566-bfa2-0c5ccbdb65b0" path="/var/lib/kubelet/pods/eb167c2e-263e-4566-bfa2-0c5ccbdb65b0/volumes" Mar 18 18:20:30 crc kubenswrapper[4830]: I0318 18:20:30.617013 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 18:20:30 crc kubenswrapper[4830]: I0318 18:20:30.997825 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6h9l5"] Mar 18 18:20:30 crc kubenswrapper[4830]: I0318 18:20:30.998725 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.012468 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6h9l5"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.113529 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7f98-account-create-update-5lnx7"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.114550 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.116376 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.124841 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7f98-account-create-update-5lnx7"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.184079 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.184190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2bmh\" (UniqueName: \"kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.187710 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fpvks"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.189483 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.202593 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4043-account-create-update-d7kjb"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.203650 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.207419 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.216951 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fpvks"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.231168 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4043-account-create-update-d7kjb"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.286941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.287014 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.287081 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2bmh\" (UniqueName: \"kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.287940 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9bbbc\" (UniqueName: \"kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.288411 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.288491 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvbt\" (UniqueName: \"kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.290353 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.314036 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rkkhc"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.314998 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rkkhc"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.315072 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.318869 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cg9pq" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.319274 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.319323 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.319416 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.326316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2bmh\" (UniqueName: \"kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh\") pod \"cinder-db-create-6h9l5\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.332909 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-q96zj"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.334007 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.342445 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q96zj"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.390832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvbt\" (UniqueName: \"kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.390912 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkmp\" (UniqueName: \"kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp\") pod \"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.390937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.390969 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.391024 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts\") pod \"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.391047 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbbc\" (UniqueName: \"kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.392244 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.392736 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.398239 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b8f8-account-create-update-vwb9c"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.399623 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.401599 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.408295 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8f8-account-create-update-vwb9c"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.412559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvbt\" (UniqueName: \"kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt\") pod \"barbican-db-create-fpvks\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.413055 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbbc\" (UniqueName: \"kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc\") pod \"cinder-7f98-account-create-update-5lnx7\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.428478 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494188 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkx9g\" (UniqueName: \"kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts\") pod \"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494364 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494396 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkmp\" (UniqueName: \"kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp\") pod 
\"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.494472 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpqg\" (UniqueName: \"kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.495392 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts\") pod \"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.502068 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.510517 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkmp\" (UniqueName: \"kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp\") pod \"barbican-4043-account-create-update-d7kjb\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.513330 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z5m7v"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.514391 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.517917 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.518552 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.522356 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z5m7v"] Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.595823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnjsp\" (UniqueName: \"kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.595900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.595926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.595972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.595993 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpqg\" (UniqueName: \"kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.596029 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkx9g\" (UniqueName: \"kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.596046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.596846 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.599522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.604576 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.614713 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.615003 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkx9g\" (UniqueName: \"kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g\") pod \"keystone-db-sync-rkkhc\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.616972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpqg\" (UniqueName: \"kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg\") pod \"neutron-db-create-q96zj\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.654925 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.662946 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.697461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnjsp\" (UniqueName: \"kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.697575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.697606 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dx8x\" (UniqueName: \"kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.697681 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.698900 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.712022 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnjsp\" (UniqueName: \"kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp\") pod \"neutron-b8f8-account-create-update-vwb9c\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.799084 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.799481 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dx8x\" (UniqueName: \"kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.800006 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.818090 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4dx8x\" (UniqueName: \"kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x\") pod \"root-account-create-update-z5m7v\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.849989 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:31 crc kubenswrapper[4830]: I0318 18:20:31.850914 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:36 crc kubenswrapper[4830]: E0318 18:20:36.496693 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120" Mar 18 18:20:36 crc kubenswrapper[4830]: E0318 18:20:36.497427 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ckns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-hxlrh_openstack(e2cad194-a0a7-44e7-8e5c-4653ae33983c): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 18 18:20:36 crc kubenswrapper[4830]: E0318 18:20:36.498762 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-hxlrh" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.017054 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q96zj"] Mar 18 18:20:37 crc kubenswrapper[4830]: W0318 18:20:37.030755 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0852c0_51a3_4de2_9f84_e1c7042f4f13.slice/crio-d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67 WatchSource:0}: Error finding container d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67: Status 404 returned error can't find the container with id d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67 Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.140414 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rkkhc"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.267358 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8f8-account-create-update-vwb9c"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.292108 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4043-account-create-update-d7kjb"] Mar 18 18:20:37 crc kubenswrapper[4830]: W0318 18:20:37.302314 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61dfaea_fa74_44a5_b2a2_d6b7232008f9.slice/crio-f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96 WatchSource:0}: Error finding container 
f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96: Status 404 returned error can't find the container with id f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96 Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.306008 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6h9l5"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.314088 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-chwf9-config-7qp5d"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.324182 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fpvks"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.451639 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7f98-account-create-update-5lnx7"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.480842 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z5m7v"] Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.489604 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkkhc" event={"ID":"d25630f0-a59e-44ba-94ba-bd0ae9216b42","Type":"ContainerStarted","Data":"e794dbc575c25828782d3ca2410722879654b0390c73596c5e6983878eca9f28"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.491086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerStarted","Data":"dae4cab83feb5262c8c7a5b8b0cb453b9f964431009385de80e3e0c21a526b8f"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.491292 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.492249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-7qp5d" 
event={"ID":"79bf951a-722d-42b3-a813-c16de894ee1f","Type":"ContainerStarted","Data":"9f648471b7cd1b293eb6db52e2f7587ed713eb58c4f265f61593323747ddd861"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.493395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fpvks" event={"ID":"9ca42574-9bdb-4ef3-bd58-3973e9144285","Type":"ContainerStarted","Data":"45b0e3c41a5c789e8e90e8c613785f69cfb31793c1c5909b6c196fef7d39432c"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.494554 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f8-account-create-update-vwb9c" event={"ID":"7db73a7a-33c0-4d36-9e96-39b5d68e5af8","Type":"ContainerStarted","Data":"ee96dd527596e3d1a14530c8012e6152ff819bcc8aa6b2c382016dba4e9822e6"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.496000 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4043-account-create-update-d7kjb" event={"ID":"be89bcb9-66b2-4bbd-bc78-be14e9503088","Type":"ContainerStarted","Data":"23f60446c0c8f76b742f35ec3f1a76450786bfff28a09af08633cef6ed0e30e8"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.496778 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6h9l5" event={"ID":"b61dfaea-fa74-44a5-b2a2-d6b7232008f9","Type":"ContainerStarted","Data":"f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.498961 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q96zj" event={"ID":"5e0852c0-51a3-4de2-9f84-e1c7042f4f13","Type":"ContainerStarted","Data":"230ee49d6b1a370d560b5372a13a835245324de53ff64139c40f778e9c9df746"} Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.499954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q96zj" 
event={"ID":"5e0852c0-51a3-4de2-9f84-e1c7042f4f13","Type":"ContainerStarted","Data":"d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67"} Mar 18 18:20:37 crc kubenswrapper[4830]: E0318 18:20:37.500104 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120\\\"\"" pod="openstack/glance-db-sync-hxlrh" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.519858 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371957.33494 podStartE2EDuration="1m19.519835983s" podCreationTimestamp="2026-03-18 18:19:18 +0000 UTC" firstStartedPulling="2026-03-18 18:19:20.705353446 +0000 UTC m=+995.272983778" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:37.51037107 +0000 UTC m=+1072.078001412" watchObservedRunningTime="2026-03-18 18:20:37.519835983 +0000 UTC m=+1072.087466325" Mar 18 18:20:37 crc kubenswrapper[4830]: I0318 18:20:37.534620 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-q96zj" podStartSLOduration=6.534603293 podStartE2EDuration="6.534603293s" podCreationTimestamp="2026-03-18 18:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:37.525271944 +0000 UTC m=+1072.092902286" watchObservedRunningTime="2026-03-18 18:20:37.534603293 +0000 UTC m=+1072.102233625" Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.526277 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"a19a7ebcd14a4be0ac0694088743b29c2a922f84d22e54d870830b7764d78682"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.526863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"44d7f582b1786283b3e923d17f41dabde89bb1069ef6be7a6bc4c163e7c6d398"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.527981 4830 generic.go:334] "Generic (PLEG): container finished" podID="7db73a7a-33c0-4d36-9e96-39b5d68e5af8" containerID="dfdcf803a034b493e9fb3e0d17cf9adc12be15bac2d2e9b4f31b3f1c84c90d38" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.528409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f8-account-create-update-vwb9c" event={"ID":"7db73a7a-33c0-4d36-9e96-39b5d68e5af8","Type":"ContainerDied","Data":"dfdcf803a034b493e9fb3e0d17cf9adc12be15bac2d2e9b4f31b3f1c84c90d38"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.536988 4830 generic.go:334] "Generic (PLEG): container finished" podID="be89bcb9-66b2-4bbd-bc78-be14e9503088" containerID="149f880fe60e94677e4b390ed1783c9f69df4024032f5dd5d9d59bc9f45e506a" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.537081 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4043-account-create-update-d7kjb" event={"ID":"be89bcb9-66b2-4bbd-bc78-be14e9503088","Type":"ContainerDied","Data":"149f880fe60e94677e4b390ed1783c9f69df4024032f5dd5d9d59bc9f45e506a"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.538616 4830 generic.go:334] "Generic (PLEG): container finished" podID="b61dfaea-fa74-44a5-b2a2-d6b7232008f9" containerID="aa08cf82fb2e3b65a4db0cafef455de1e72e29ea5cb6fff9ccdb05335df61a7a" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.538715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-6h9l5" event={"ID":"b61dfaea-fa74-44a5-b2a2-d6b7232008f9","Type":"ContainerDied","Data":"aa08cf82fb2e3b65a4db0cafef455de1e72e29ea5cb6fff9ccdb05335df61a7a"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.539963 4830 generic.go:334] "Generic (PLEG): container finished" podID="7e6b8f3f-7b85-4504-b582-07edbfee2020" containerID="2ec0b23590aa07b58f0bce309a6926e720bbad903a64b11e9d199099e2f7a8f8" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.540013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5m7v" event={"ID":"7e6b8f3f-7b85-4504-b582-07edbfee2020","Type":"ContainerDied","Data":"2ec0b23590aa07b58f0bce309a6926e720bbad903a64b11e9d199099e2f7a8f8"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.540031 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5m7v" event={"ID":"7e6b8f3f-7b85-4504-b582-07edbfee2020","Type":"ContainerStarted","Data":"fad8c95c3de1c6b71faebc2936531ec0c08a86068ad8efbc3322f20e10edf088"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.552392 4830 generic.go:334] "Generic (PLEG): container finished" podID="79bf951a-722d-42b3-a813-c16de894ee1f" containerID="76403c1ba924bd720fa9ad0e8b9fdcc56b2531f042be32e994982f3fc7c33064" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.552469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-7qp5d" event={"ID":"79bf951a-722d-42b3-a813-c16de894ee1f","Type":"ContainerDied","Data":"76403c1ba924bd720fa9ad0e8b9fdcc56b2531f042be32e994982f3fc7c33064"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.555136 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ca42574-9bdb-4ef3-bd58-3973e9144285" containerID="f310910b8917cc5993658361fcb89166ad9ce2e06a60d4b649ae75c286fa88be" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.555193 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fpvks" event={"ID":"9ca42574-9bdb-4ef3-bd58-3973e9144285","Type":"ContainerDied","Data":"f310910b8917cc5993658361fcb89166ad9ce2e06a60d4b649ae75c286fa88be"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.556901 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e0852c0-51a3-4de2-9f84-e1c7042f4f13" containerID="230ee49d6b1a370d560b5372a13a835245324de53ff64139c40f778e9c9df746" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.556947 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q96zj" event={"ID":"5e0852c0-51a3-4de2-9f84-e1c7042f4f13","Type":"ContainerDied","Data":"230ee49d6b1a370d560b5372a13a835245324de53ff64139c40f778e9c9df746"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.565296 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf0bd2ef-825f-4fad-8a4a-135941d72b5b" containerID="bfe502cfd69f0dcee4198317e7340351aefeb4d8e022de2043630ae66cc8612a" exitCode=0 Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.565398 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f98-account-create-update-5lnx7" event={"ID":"cf0bd2ef-825f-4fad-8a4a-135941d72b5b","Type":"ContainerDied","Data":"bfe502cfd69f0dcee4198317e7340351aefeb4d8e022de2043630ae66cc8612a"} Mar 18 18:20:38 crc kubenswrapper[4830]: I0318 18:20:38.565464 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f98-account-create-update-5lnx7" event={"ID":"cf0bd2ef-825f-4fad-8a4a-135941d72b5b","Type":"ContainerStarted","Data":"f88a06adbcb9023c467b0a3066510f91933a675bbae714b733fbb3b4d0f046b4"} Mar 18 18:20:39 crc kubenswrapper[4830]: I0318 18:20:39.583520 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"19e2f77105d5703f0646d3c61e7fe7c902c627dbb91bbc626d9e5d5bb3fa485c"} Mar 18 
18:20:39 crc kubenswrapper[4830]: I0318 18:20:39.584126 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"68ad223077ac746b9802f4eba8764e5eaa00ca98bf3773872d2cd95daf9b38f0"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.280423 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.286101 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.291438 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.301716 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.342024 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.344963 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzvbt\" (UniqueName: \"kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt\") pod \"9ca42574-9bdb-4ef3-bd58-3973e9144285\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345062 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts\") pod \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345153 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts\") pod \"9ca42574-9bdb-4ef3-bd58-3973e9144285\" (UID: \"9ca42574-9bdb-4ef3-bd58-3973e9144285\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345199 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2bmh\" (UniqueName: \"kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh\") pod \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345274 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fpqg\" (UniqueName: \"kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg\") pod \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\" (UID: \"5e0852c0-51a3-4de2-9f84-e1c7042f4f13\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345363 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts\") pod \"7e6b8f3f-7b85-4504-b582-07edbfee2020\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345429 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts\") pod \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\" (UID: \"b61dfaea-fa74-44a5-b2a2-d6b7232008f9\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.345454 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dx8x\" (UniqueName: \"kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x\") pod \"7e6b8f3f-7b85-4504-b582-07edbfee2020\" (UID: \"7e6b8f3f-7b85-4504-b582-07edbfee2020\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346067 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346129 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e6b8f3f-7b85-4504-b582-07edbfee2020" (UID: "7e6b8f3f-7b85-4504-b582-07edbfee2020"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346158 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b61dfaea-fa74-44a5-b2a2-d6b7232008f9" (UID: "b61dfaea-fa74-44a5-b2a2-d6b7232008f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346337 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e0852c0-51a3-4de2-9f84-e1c7042f4f13" (UID: "5e0852c0-51a3-4de2-9f84-e1c7042f4f13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.346455 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ca42574-9bdb-4ef3-bd58-3973e9144285" (UID: "9ca42574-9bdb-4ef3-bd58-3973e9144285"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.348210 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.352213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt" (OuterVolumeSpecName: "kube-api-access-bzvbt") pod "9ca42574-9bdb-4ef3-bd58-3973e9144285" (UID: "9ca42574-9bdb-4ef3-bd58-3973e9144285"). 
InnerVolumeSpecName "kube-api-access-bzvbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.356022 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg" (OuterVolumeSpecName: "kube-api-access-7fpqg") pod "5e0852c0-51a3-4de2-9f84-e1c7042f4f13" (UID: "5e0852c0-51a3-4de2-9f84-e1c7042f4f13"). InnerVolumeSpecName "kube-api-access-7fpqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.368433 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x" (OuterVolumeSpecName: "kube-api-access-4dx8x") pod "7e6b8f3f-7b85-4504-b582-07edbfee2020" (UID: "7e6b8f3f-7b85-4504-b582-07edbfee2020"). InnerVolumeSpecName "kube-api-access-4dx8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.381608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh" (OuterVolumeSpecName: "kube-api-access-n2bmh") pod "b61dfaea-fa74-44a5-b2a2-d6b7232008f9" (UID: "b61dfaea-fa74-44a5-b2a2-d6b7232008f9"). InnerVolumeSpecName "kube-api-access-n2bmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts\") pod \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447592 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447660 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdkmp\" (UniqueName: \"kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp\") pod \"be89bcb9-66b2-4bbd-bc78-be14e9503088\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447726 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447752 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnjsp\" (UniqueName: \"kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp\") pod \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447821 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bbbc\" (UniqueName: \"kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc\") pod \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\" (UID: \"cf0bd2ef-825f-4fad-8a4a-135941d72b5b\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447842 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts\") pod \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\" (UID: \"7db73a7a-33c0-4d36-9e96-39b5d68e5af8\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447891 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6cnn\" (UniqueName: \"kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn\") pod \"79bf951a-722d-42b3-a813-c16de894ee1f\" (UID: \"79bf951a-722d-42b3-a813-c16de894ee1f\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.447916 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts\") pod \"be89bcb9-66b2-4bbd-bc78-be14e9503088\" (UID: \"be89bcb9-66b2-4bbd-bc78-be14e9503088\") " Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448131 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf0bd2ef-825f-4fad-8a4a-135941d72b5b" (UID: "cf0bd2ef-825f-4fad-8a4a-135941d72b5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448252 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2bmh\" (UniqueName: \"kubernetes.io/projected/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-kube-api-access-n2bmh\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448270 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fpqg\" (UniqueName: \"kubernetes.io/projected/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-kube-api-access-7fpqg\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448279 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6b8f3f-7b85-4504-b582-07edbfee2020-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448288 4830 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448297 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b61dfaea-fa74-44a5-b2a2-d6b7232008f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448305 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dx8x\" (UniqueName: \"kubernetes.io/projected/7e6b8f3f-7b85-4504-b582-07edbfee2020-kube-api-access-4dx8x\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448314 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448322 4830 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448331 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzvbt\" (UniqueName: \"kubernetes.io/projected/9ca42574-9bdb-4ef3-bd58-3973e9144285-kube-api-access-bzvbt\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448342 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e0852c0-51a3-4de2-9f84-e1c7042f4f13-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448350 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca42574-9bdb-4ef3-bd58-3973e9144285-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.448733 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "be89bcb9-66b2-4bbd-bc78-be14e9503088" (UID: "be89bcb9-66b2-4bbd-bc78-be14e9503088"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.449171 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7db73a7a-33c0-4d36-9e96-39b5d68e5af8" (UID: "7db73a7a-33c0-4d36-9e96-39b5d68e5af8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.449651 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run" (OuterVolumeSpecName: "var-run") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.450951 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc" (OuterVolumeSpecName: "kube-api-access-9bbbc") pod "cf0bd2ef-825f-4fad-8a4a-135941d72b5b" (UID: "cf0bd2ef-825f-4fad-8a4a-135941d72b5b"). InnerVolumeSpecName "kube-api-access-9bbbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.453041 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.453263 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn" (OuterVolumeSpecName: "kube-api-access-f6cnn") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "kube-api-access-f6cnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.453481 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp" (OuterVolumeSpecName: "kube-api-access-xdkmp") pod "be89bcb9-66b2-4bbd-bc78-be14e9503088" (UID: "be89bcb9-66b2-4bbd-bc78-be14e9503088"). InnerVolumeSpecName "kube-api-access-xdkmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.454753 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts" (OuterVolumeSpecName: "scripts") pod "79bf951a-722d-42b3-a813-c16de894ee1f" (UID: "79bf951a-722d-42b3-a813-c16de894ee1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.456909 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp" (OuterVolumeSpecName: "kube-api-access-rnjsp") pod "7db73a7a-33c0-4d36-9e96-39b5d68e5af8" (UID: "7db73a7a-33c0-4d36-9e96-39b5d68e5af8"). InnerVolumeSpecName "kube-api-access-rnjsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549319 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bbbc\" (UniqueName: \"kubernetes.io/projected/cf0bd2ef-825f-4fad-8a4a-135941d72b5b-kube-api-access-9bbbc\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549351 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549360 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6cnn\" (UniqueName: \"kubernetes.io/projected/79bf951a-722d-42b3-a813-c16de894ee1f-kube-api-access-f6cnn\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549369 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be89bcb9-66b2-4bbd-bc78-be14e9503088-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549378 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549386 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79bf951a-722d-42b3-a813-c16de894ee1f-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549395 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdkmp\" (UniqueName: \"kubernetes.io/projected/be89bcb9-66b2-4bbd-bc78-be14e9503088-kube-api-access-xdkmp\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: 
I0318 18:20:42.549405 4830 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79bf951a-722d-42b3-a813-c16de894ee1f-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.549413 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnjsp\" (UniqueName: \"kubernetes.io/projected/7db73a7a-33c0-4d36-9e96-39b5d68e5af8-kube-api-access-rnjsp\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.618237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f8-account-create-update-vwb9c" event={"ID":"7db73a7a-33c0-4d36-9e96-39b5d68e5af8","Type":"ContainerDied","Data":"ee96dd527596e3d1a14530c8012e6152ff819bcc8aa6b2c382016dba4e9822e6"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.618275 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee96dd527596e3d1a14530c8012e6152ff819bcc8aa6b2c382016dba4e9822e6" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.618320 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-vwb9c" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.623738 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4043-account-create-update-d7kjb" event={"ID":"be89bcb9-66b2-4bbd-bc78-be14e9503088","Type":"ContainerDied","Data":"23f60446c0c8f76b742f35ec3f1a76450786bfff28a09af08633cef6ed0e30e8"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.623764 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f60446c0c8f76b742f35ec3f1a76450786bfff28a09af08633cef6ed0e30e8" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.623817 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4043-account-create-update-d7kjb" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.631911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5m7v" event={"ID":"7e6b8f3f-7b85-4504-b582-07edbfee2020","Type":"ContainerDied","Data":"fad8c95c3de1c6b71faebc2936531ec0c08a86068ad8efbc3322f20e10edf088"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.632006 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad8c95c3de1c6b71faebc2936531ec0c08a86068ad8efbc3322f20e10edf088" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.632049 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5m7v" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.638326 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q96zj" event={"ID":"5e0852c0-51a3-4de2-9f84-e1c7042f4f13","Type":"ContainerDied","Data":"d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.638431 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d676fe515bd53cad2c250ea409334e3e12ea9a2dae6bf32fdade516fa6215a67" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.638536 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q96zj" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.644837 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f98-account-create-update-5lnx7" event={"ID":"cf0bd2ef-825f-4fad-8a4a-135941d72b5b","Type":"ContainerDied","Data":"f88a06adbcb9023c467b0a3066510f91933a675bbae714b733fbb3b4d0f046b4"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.644860 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88a06adbcb9023c467b0a3066510f91933a675bbae714b733fbb3b4d0f046b4" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.645008 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7f98-account-create-update-5lnx7" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.646279 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9-config-7qp5d" event={"ID":"79bf951a-722d-42b3-a813-c16de894ee1f","Type":"ContainerDied","Data":"9f648471b7cd1b293eb6db52e2f7587ed713eb58c4f265f61593323747ddd861"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.646300 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f648471b7cd1b293eb6db52e2f7587ed713eb58c4f265f61593323747ddd861" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.646336 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-chwf9-config-7qp5d" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.649391 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fpvks" event={"ID":"9ca42574-9bdb-4ef3-bd58-3973e9144285","Type":"ContainerDied","Data":"45b0e3c41a5c789e8e90e8c613785f69cfb31793c1c5909b6c196fef7d39432c"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.649413 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b0e3c41a5c789e8e90e8c613785f69cfb31793c1c5909b6c196fef7d39432c" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.649443 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fpvks" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.653807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6h9l5" event={"ID":"b61dfaea-fa74-44a5-b2a2-d6b7232008f9","Type":"ContainerDied","Data":"f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.653841 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b88a9cbbc941b8f38e5d0f67da05b4e0202606526cb0478abae41ab33c8d96" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.653890 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6h9l5" Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.656392 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkkhc" event={"ID":"d25630f0-a59e-44ba-94ba-bd0ae9216b42","Type":"ContainerStarted","Data":"3e305ed4bde4816d27ef7a6fa961104500d0307d2c1c0ea2ab3b18a26aa4b3f2"} Mar 18 18:20:42 crc kubenswrapper[4830]: I0318 18:20:42.687976 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rkkhc" podStartSLOduration=6.90490583 podStartE2EDuration="11.687956135s" podCreationTimestamp="2026-03-18 18:20:31 +0000 UTC" firstStartedPulling="2026-03-18 18:20:37.294543558 +0000 UTC m=+1071.862173890" lastFinishedPulling="2026-03-18 18:20:42.077593853 +0000 UTC m=+1076.645224195" observedRunningTime="2026-03-18 18:20:42.678341187 +0000 UTC m=+1077.245971509" watchObservedRunningTime="2026-03-18 18:20:42.687956135 +0000 UTC m=+1077.255586467" Mar 18 18:20:43 crc kubenswrapper[4830]: I0318 18:20:43.482807 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-chwf9-config-7qp5d"] Mar 18 18:20:43 crc kubenswrapper[4830]: I0318 18:20:43.488355 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-chwf9-config-7qp5d"] Mar 18 18:20:43 crc kubenswrapper[4830]: I0318 18:20:43.670650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"36d3831532c5080f76c17b505df06b38c560192c7e4793abf714c8adf589ca70"} Mar 18 18:20:43 crc kubenswrapper[4830]: I0318 18:20:43.670688 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"b34b3a8f9ec6ede03cd125304c68ec8d92f19169893d3ffa48c8c3477adb2572"} Mar 18 18:20:43 crc kubenswrapper[4830]: I0318 18:20:43.670701 
4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"3d815a588191bb1f303bdb826c5890be7d59362cd896066cfe0bd7ed228c7623"} Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.248757 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bf951a-722d-42b3-a813-c16de894ee1f" path="/var/lib/kubelet/pods/79bf951a-722d-42b3-a813-c16de894ee1f/volumes" Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.692913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"a20dea92408e3316b920fe3e34c3564167b91f44ec33c56fd94553ec6a29e550"} Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.693372 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"ddac036c21cf6e7f7086be2d69ffc2a2c68d39299f23922898025a29a0596dc2"} Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.693395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"93c102b5fc9f4a88a8768bdc36062b71725eccef648daf124aa807bb59ea8cd8"} Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.693415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerStarted","Data":"a4919c4786f2548b6767558777a241dc56d419f9904e042566ee841adbe1f1e3"} Mar 18 18:20:44 crc kubenswrapper[4830]: I0318 18:20:44.750025 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.016702771 podStartE2EDuration="39.749995681s" podCreationTimestamp="2026-03-18 18:20:05 +0000 UTC" 
firstStartedPulling="2026-03-18 18:20:23.028853658 +0000 UTC m=+1057.596483990" lastFinishedPulling="2026-03-18 18:20:42.762146568 +0000 UTC m=+1077.329776900" observedRunningTime="2026-03-18 18:20:44.738163932 +0000 UTC m=+1079.305794304" watchObservedRunningTime="2026-03-18 18:20:44.749995681 +0000 UTC m=+1079.317626053" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.120381 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.120924 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0852c0-51a3-4de2-9f84-e1c7042f4f13" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.120955 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0852c0-51a3-4de2-9f84-e1c7042f4f13" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.120973 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca42574-9bdb-4ef3-bd58-3973e9144285" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.120985 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca42574-9bdb-4ef3-bd58-3973e9144285" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121008 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bf951a-722d-42b3-a813-c16de894ee1f" containerName="ovn-config" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121020 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bf951a-722d-42b3-a813-c16de894ee1f" containerName="ovn-config" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121035 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6b8f3f-7b85-4504-b582-07edbfee2020" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121045 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6b8f3f-7b85-4504-b582-07edbfee2020" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121066 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61dfaea-fa74-44a5-b2a2-d6b7232008f9" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121076 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61dfaea-fa74-44a5-b2a2-d6b7232008f9" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121114 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0bd2ef-825f-4fad-8a4a-135941d72b5b" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121124 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0bd2ef-825f-4fad-8a4a-135941d72b5b" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121137 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be89bcb9-66b2-4bbd-bc78-be14e9503088" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121147 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="be89bcb9-66b2-4bbd-bc78-be14e9503088" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: E0318 18:20:45.121163 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db73a7a-33c0-4d36-9e96-39b5d68e5af8" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121175 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db73a7a-33c0-4d36-9e96-39b5d68e5af8" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121423 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61dfaea-fa74-44a5-b2a2-d6b7232008f9" 
containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121441 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0852c0-51a3-4de2-9f84-e1c7042f4f13" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121462 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca42574-9bdb-4ef3-bd58-3973e9144285" containerName="mariadb-database-create" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121478 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0bd2ef-825f-4fad-8a4a-135941d72b5b" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121502 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="be89bcb9-66b2-4bbd-bc78-be14e9503088" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121515 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db73a7a-33c0-4d36-9e96-39b5d68e5af8" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121531 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6b8f3f-7b85-4504-b582-07edbfee2020" containerName="mariadb-account-create-update" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.121549 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bf951a-722d-42b3-a813-c16de894ee1f" containerName="ovn-config" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.122943 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.125697 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.142596 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.203620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.203662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.203697 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bb2\" (UniqueName: \"kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.203736 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " 
pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.204071 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.204136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.305395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.305442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.305505 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc 
kubenswrapper[4830]: I0318 18:20:45.305540 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.305573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bb2\" (UniqueName: \"kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.305623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.306827 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.307035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.307032 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.307294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.307388 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.329311 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bb2\" (UniqueName: \"kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2\") pod \"dnsmasq-dns-cb65b4b5-sqpx7\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.444823 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.708056 4830 generic.go:334] "Generic (PLEG): container finished" podID="d25630f0-a59e-44ba-94ba-bd0ae9216b42" containerID="3e305ed4bde4816d27ef7a6fa961104500d0307d2c1c0ea2ab3b18a26aa4b3f2" exitCode=0 Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.711113 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkkhc" event={"ID":"d25630f0-a59e-44ba-94ba-bd0ae9216b42","Type":"ContainerDied","Data":"3e305ed4bde4816d27ef7a6fa961104500d0307d2c1c0ea2ab3b18a26aa4b3f2"} Mar 18 18:20:45 crc kubenswrapper[4830]: I0318 18:20:45.782459 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:46 crc kubenswrapper[4830]: I0318 18:20:46.722881 4830 generic.go:334] "Generic (PLEG): container finished" podID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerID="2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348" exitCode=0 Mar 18 18:20:46 crc kubenswrapper[4830]: I0318 18:20:46.722954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" event={"ID":"001a299a-877e-4e4b-a2e4-24b82df345b8","Type":"ContainerDied","Data":"2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348"} Mar 18 18:20:46 crc kubenswrapper[4830]: I0318 18:20:46.723342 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" event={"ID":"001a299a-877e-4e4b-a2e4-24b82df345b8","Type":"ContainerStarted","Data":"40c8e8e4152f17689b440fdf238e5ea24084e600517150b2c2e8c4e71f546f70"} Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.088466 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.138645 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkx9g\" (UniqueName: \"kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g\") pod \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.138693 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data\") pod \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.138805 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle\") pod \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\" (UID: \"d25630f0-a59e-44ba-94ba-bd0ae9216b42\") " Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.143059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g" (OuterVolumeSpecName: "kube-api-access-nkx9g") pod "d25630f0-a59e-44ba-94ba-bd0ae9216b42" (UID: "d25630f0-a59e-44ba-94ba-bd0ae9216b42"). InnerVolumeSpecName "kube-api-access-nkx9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.161234 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d25630f0-a59e-44ba-94ba-bd0ae9216b42" (UID: "d25630f0-a59e-44ba-94ba-bd0ae9216b42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.188346 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data" (OuterVolumeSpecName: "config-data") pod "d25630f0-a59e-44ba-94ba-bd0ae9216b42" (UID: "d25630f0-a59e-44ba-94ba-bd0ae9216b42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.240627 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.240659 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkx9g\" (UniqueName: \"kubernetes.io/projected/d25630f0-a59e-44ba-94ba-bd0ae9216b42-kube-api-access-nkx9g\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.240675 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25630f0-a59e-44ba-94ba-bd0ae9216b42-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.733068 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkkhc" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.733350 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkkhc" event={"ID":"d25630f0-a59e-44ba-94ba-bd0ae9216b42","Type":"ContainerDied","Data":"e794dbc575c25828782d3ca2410722879654b0390c73596c5e6983878eca9f28"} Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.733399 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e794dbc575c25828782d3ca2410722879654b0390c73596c5e6983878eca9f28" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.739676 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" event={"ID":"001a299a-877e-4e4b-a2e4-24b82df345b8","Type":"ContainerStarted","Data":"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead"} Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.739911 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.769372 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" podStartSLOduration=2.7693338240000003 podStartE2EDuration="2.769333824s" podCreationTimestamp="2026-03-18 18:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:47.75733317 +0000 UTC m=+1082.324963512" watchObservedRunningTime="2026-03-18 18:20:47.769333824 +0000 UTC m=+1082.336964166" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.912628 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.947911 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wrpjj"] Mar 18 18:20:47 crc 
kubenswrapper[4830]: E0318 18:20:47.948266 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25630f0-a59e-44ba-94ba-bd0ae9216b42" containerName="keystone-db-sync" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.948283 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25630f0-a59e-44ba-94ba-bd0ae9216b42" containerName="keystone-db-sync" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.948452 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25630f0-a59e-44ba-94ba-bd0ae9216b42" containerName="keystone-db-sync" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.948969 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.954245 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.954282 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cg9pq" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.954394 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.954426 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:20:47 crc kubenswrapper[4830]: I0318 18:20:47.954553 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.007043 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wrpjj"] Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.013812 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"] Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.015130 4830 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.061329 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.061587 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.061669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kc52\" (UniqueName: \"kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.061755 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.061965 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58zj\" (UniqueName: \"kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj\") pod 
\"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.062052 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.063618 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.063738 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.063854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.063929 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.064001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.064167 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.111122 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"] Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.153715 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-96knc"] Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.154711 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.157703 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.158117 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bzz8b"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.158239 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.161106 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-96knc"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.165724 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.165833 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166705 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166744 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kc52\" (UniqueName: \"kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166765 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58zj\" (UniqueName: \"kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166891 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.166979 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.167686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.168175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.168705 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.169393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.170939 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.178106 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.179065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.179834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.180103 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.180371 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.181241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.183306 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.198119 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.198999 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.203616 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.204229 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kc52\" (UniqueName: \"kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52\") pod \"keystone-bootstrap-wrpjj\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.204862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58zj\" (UniqueName: \"kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj\") pod \"dnsmasq-dns-5ff8446d97-s68gs\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.229052 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vgb8c"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.230004 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.236204 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.236593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sjj5h"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.236794 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.250197 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vgb8c"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.269990 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270281 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270378 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270459 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270540 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270615 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270788 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zc4\" (UniqueName: \"kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.270933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd26g\" (UniqueName: \"kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.271003 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lntz\" (UniqueName: \"kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.271064 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.275237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.275729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.275884 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.275980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.307643 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrpjj"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.318215 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wvvs5"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.320344 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.324398 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.324792 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bwpfg"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.330605 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.333507 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wvvs5"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.377605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.377820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zc4\" (UniqueName: \"kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.377888 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.377957 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd26g\" (UniqueName: \"kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378054 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lntz\" (UniqueName: \"kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trfdx\" (UniqueName: \"kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378360 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378503 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378568 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378650 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.378799 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.383215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.383318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.383409 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.383495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.384694 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.392797 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.395368 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.395451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.395642 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.397664 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.411019 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.411728 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.414781 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lntz\" (UniqueName: \"kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.424299 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.426142 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.428210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd26g\" (UniqueName: \"kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.434906 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.435272 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data\") pod \"ceilometer-0\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") " pod="openstack/ceilometer-0"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.435378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data\") pod \"cinder-db-sync-96knc\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.435509 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zc4\" (UniqueName: \"kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4\") pod \"neutron-db-sync-vgb8c\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") " pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.444815 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.477821 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.479159 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.484603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trfdx\" (UniqueName: \"kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.484636 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.484688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.489963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.491119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.507342 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trfdx\" (UniqueName: \"kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx\") pod \"barbican-db-sync-wvvs5\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " pod="openstack/barbican-db-sync-wvvs5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.510117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.521674 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-msrv5"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.523127 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-msrv5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.525546 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.525838 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bq9d5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.525895 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.551959 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-msrv5"]
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.565637 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-96knc"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.585607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.585652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.585676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.585693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2p2q\" (UniqueName: \"kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.585712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586228 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs"
Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586286 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf249\" (UniqueName:
\"kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.586429 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.597469 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.608577 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vgb8c" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.706887 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.706959 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707013 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2p2q\" (UniqueName: \"kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc 
kubenswrapper[4830]: I0318 18:20:48.707105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707130 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707245 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707273 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf249\" (UniqueName: \"kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707310 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.707761 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.709120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.709984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.710792 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.711297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.711550 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.711833 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.724592 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.724608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.729270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf249\" (UniqueName: \"kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249\") pod \"dnsmasq-dns-7ff6d84665-njgzs\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " 
pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.731917 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2p2q\" (UniqueName: \"kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q\") pod \"placement-db-sync-msrv5\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.743293 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvvs5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.801659 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.840431 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-msrv5" Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.956187 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"] Mar 18 18:20:48 crc kubenswrapper[4830]: I0318 18:20:48.987695 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wrpjj"] Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.112994 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-96knc"] Mar 18 18:20:49 crc kubenswrapper[4830]: W0318 18:20:49.136459 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c42d089_56c7_45ee_ba54_ee464499ff29.slice/crio-3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc WatchSource:0}: Error finding container 3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc: Status 404 returned error can't find the container with id 3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc Mar 18 
18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.281336 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.310187 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vgb8c"] Mar 18 18:20:49 crc kubenswrapper[4830]: W0318 18:20:49.312423 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82570719_0e07_4ef2_adee_a287052cc4ac.slice/crio-7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42 WatchSource:0}: Error finding container 7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42: Status 404 returned error can't find the container with id 7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42 Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.364654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-msrv5"] Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.479796 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wvvs5"] Mar 18 18:20:49 crc kubenswrapper[4830]: W0318 18:20:49.482227 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40d348ed_98d2_494b_b2b1_f1dfb190a636.slice/crio-8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9 WatchSource:0}: Error finding container 8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9: Status 404 returned error can't find the container with id 8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9 Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.501922 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"] Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.764089 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-msrv5" event={"ID":"8a1597fb-6c27-4b75-8996-40ff17a49e69","Type":"ContainerStarted","Data":"e26967556c1953b30de9bf4c96e8d2a1178cff18d47f97516db68d3b6692573b"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.768373 4830 generic.go:334] "Generic (PLEG): container finished" podID="a5a22953-2e4d-48be-a69a-8593623257d0" containerID="4cc0f1d5708291b39e153605a1e08f728fb768cd34d78108377252c9dde37c9e" exitCode=0 Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.768458 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" event={"ID":"a5a22953-2e4d-48be-a69a-8593623257d0","Type":"ContainerDied","Data":"4cc0f1d5708291b39e153605a1e08f728fb768cd34d78108377252c9dde37c9e"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.768491 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" event={"ID":"a5a22953-2e4d-48be-a69a-8593623257d0","Type":"ContainerStarted","Data":"a3c7bdcc516cc22e900d4989318f7c4511f02747ce2f65d857b9ef79b6b279f9"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.771731 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerID="6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39" exitCode=0 Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.771782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" event={"ID":"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6","Type":"ContainerDied","Data":"6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.771829 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" event={"ID":"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6","Type":"ContainerStarted","Data":"5bddc21daf1dfae2fcab84f96e1cdd80870e578df077ea97541132ef044958a0"} Mar 18 18:20:49 crc kubenswrapper[4830]: 
I0318 18:20:49.773353 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrpjj" event={"ID":"4ff5d5df-234c-4628-bde1-11f3673a22fd","Type":"ContainerStarted","Data":"119d8ed4821b740db22ee1445ab8b58f898477bd08bc5e02c8c7846598245ad2"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.773419 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrpjj" event={"ID":"4ff5d5df-234c-4628-bde1-11f3673a22fd","Type":"ContainerStarted","Data":"42a7d2cfdc68f53354e6462975b5a7583d5b3b0573163d070ccb29265a16477e"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.774180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerStarted","Data":"7a00b93a5794418067e1fb78825d2e8e9864ecafcfce286d6f6c1317bcde293d"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.774903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvvs5" event={"ID":"40d348ed-98d2-494b-b2b1-f1dfb190a636","Type":"ContainerStarted","Data":"8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.780012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgb8c" event={"ID":"82570719-0e07-4ef2-adee-a287052cc4ac","Type":"ContainerStarted","Data":"7c515d7ae75f9681256fdbe1f69965eb9c3684f823264317db51e5df64736fcd"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.780049 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgb8c" event={"ID":"82570719-0e07-4ef2-adee-a287052cc4ac","Type":"ContainerStarted","Data":"7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.785284 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-96knc" 
event={"ID":"8c42d089-56c7-45ee-ba54-ee464499ff29","Type":"ContainerStarted","Data":"3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc"} Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.785609 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="dnsmasq-dns" containerID="cri-o://498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead" gracePeriod=10 Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.813179 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wrpjj" podStartSLOduration=2.813161463 podStartE2EDuration="2.813161463s" podCreationTimestamp="2026-03-18 18:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:49.804233305 +0000 UTC m=+1084.371863637" watchObservedRunningTime="2026-03-18 18:20:49.813161463 +0000 UTC m=+1084.380791795" Mar 18 18:20:49 crc kubenswrapper[4830]: I0318 18:20:49.843110 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vgb8c" podStartSLOduration=1.843088135 podStartE2EDuration="1.843088135s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:49.8270643 +0000 UTC m=+1084.394694632" watchObservedRunningTime="2026-03-18 18:20:49.843088135 +0000 UTC m=+1084.410718467" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.038133 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58zj\" (UniqueName: \"kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168719 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168824 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.168948 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config\") pod \"a5a22953-2e4d-48be-a69a-8593623257d0\" (UID: \"a5a22953-2e4d-48be-a69a-8593623257d0\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.192937 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.201210 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj" (OuterVolumeSpecName: "kube-api-access-w58zj") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "kube-api-access-w58zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.206234 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.212703 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.258208 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config" (OuterVolumeSpecName: "config") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.278102 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.278141 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.278157 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58zj\" (UniqueName: \"kubernetes.io/projected/a5a22953-2e4d-48be-a69a-8593623257d0-kube-api-access-w58zj\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.278169 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.320742 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.341053 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.341924 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.350159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a5a22953-2e4d-48be-a69a-8593623257d0" (UID: "a5a22953-2e4d-48be-a69a-8593623257d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.378792 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.378857 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.378877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 
crc kubenswrapper[4830]: I0318 18:20:50.378920 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.379028 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2bb2\" (UniqueName: \"kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.379084 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb\") pod \"001a299a-877e-4e4b-a2e4-24b82df345b8\" (UID: \"001a299a-877e-4e4b-a2e4-24b82df345b8\") " Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.379452 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.379469 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5a22953-2e4d-48be-a69a-8593623257d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.400983 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2" (OuterVolumeSpecName: "kube-api-access-v2bb2") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). 
InnerVolumeSpecName "kube-api-access-v2bb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.454902 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config" (OuterVolumeSpecName: "config") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.455656 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.474489 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.476647 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.481887 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.481919 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.481928 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.481940 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2bb2\" (UniqueName: \"kubernetes.io/projected/001a299a-877e-4e4b-a2e4-24b82df345b8-kube-api-access-v2bb2\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.481948 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.511260 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "001a299a-877e-4e4b-a2e4-24b82df345b8" (UID: "001a299a-877e-4e4b-a2e4-24b82df345b8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.583807 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/001a299a-877e-4e4b-a2e4-24b82df345b8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.803447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" event={"ID":"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6","Type":"ContainerStarted","Data":"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8"} Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.804557 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.812693 4830 generic.go:334] "Generic (PLEG): container finished" podID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerID="498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead" exitCode=0 Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.812798 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" event={"ID":"001a299a-877e-4e4b-a2e4-24b82df345b8","Type":"ContainerDied","Data":"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead"} Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.812825 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" event={"ID":"001a299a-877e-4e4b-a2e4-24b82df345b8","Type":"ContainerDied","Data":"40c8e8e4152f17689b440fdf238e5ea24084e600517150b2c2e8c4e71f546f70"} Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.812845 4830 scope.go:117] "RemoveContainer" containerID="498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.812974 4830 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-sqpx7" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.817480 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" event={"ID":"a5a22953-2e4d-48be-a69a-8593623257d0","Type":"ContainerDied","Data":"a3c7bdcc516cc22e900d4989318f7c4511f02747ce2f65d857b9ef79b6b279f9"} Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.817533 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ff8446d97-s68gs" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.826459 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" podStartSLOduration=2.826441388 podStartE2EDuration="2.826441388s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:20:50.823339352 +0000 UTC m=+1085.390969684" watchObservedRunningTime="2026-03-18 18:20:50.826441388 +0000 UTC m=+1085.394071710" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.845308 4830 scope.go:117] "RemoveContainer" containerID="2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.899929 4830 scope.go:117] "RemoveContainer" containerID="498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead" Mar 18 18:20:50 crc kubenswrapper[4830]: E0318 18:20:50.912701 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead\": container with ID starting with 498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead not found: ID does not exist" containerID="498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead" Mar 18 18:20:50 crc 
kubenswrapper[4830]: I0318 18:20:50.912784 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead"} err="failed to get container status \"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead\": rpc error: code = NotFound desc = could not find container \"498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead\": container with ID starting with 498088e26660da7eae741eb26c5921cf04ce7ff512f9c078029af63bb79e9ead not found: ID does not exist" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.912816 4830 scope.go:117] "RemoveContainer" containerID="2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348" Mar 18 18:20:50 crc kubenswrapper[4830]: E0318 18:20:50.913816 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348\": container with ID starting with 2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348 not found: ID does not exist" containerID="2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.913882 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348"} err="failed to get container status \"2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348\": rpc error: code = NotFound desc = could not find container \"2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348\": container with ID starting with 2117a7acf8456cc8a293f1e114c3584125cb7311ff6bac7d1dc4846c989bb348 not found: ID does not exist" Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.913910 4830 scope.go:117] "RemoveContainer" containerID="4cc0f1d5708291b39e153605a1e08f728fb768cd34d78108377252c9dde37c9e" Mar 18 
18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.949320 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"] Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.954200 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ff8446d97-s68gs"] Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.961420 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:50 crc kubenswrapper[4830]: I0318 18:20:50.968582 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-sqpx7"] Mar 18 18:20:52 crc kubenswrapper[4830]: I0318 18:20:52.245861 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" path="/var/lib/kubelet/pods/001a299a-877e-4e4b-a2e4-24b82df345b8/volumes" Mar 18 18:20:52 crc kubenswrapper[4830]: I0318 18:20:52.247109 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a22953-2e4d-48be-a69a-8593623257d0" path="/var/lib/kubelet/pods/a5a22953-2e4d-48be-a69a-8593623257d0/volumes" Mar 18 18:20:52 crc kubenswrapper[4830]: I0318 18:20:52.842388 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hxlrh" event={"ID":"e2cad194-a0a7-44e7-8e5c-4653ae33983c","Type":"ContainerStarted","Data":"7c99bf884aafec1098e4dad942ec1e611c8c819bbe83815db3fdad19bac4fd8e"} Mar 18 18:20:52 crc kubenswrapper[4830]: I0318 18:20:52.868392 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hxlrh" podStartSLOduration=3.412519304 podStartE2EDuration="34.868375334s" podCreationTimestamp="2026-03-18 18:20:18 +0000 UTC" firstStartedPulling="2026-03-18 18:20:19.532446609 +0000 UTC m=+1054.100076951" lastFinishedPulling="2026-03-18 18:20:50.988302649 +0000 UTC m=+1085.555932981" observedRunningTime="2026-03-18 18:20:52.858649673 +0000 UTC m=+1087.426280015" 
watchObservedRunningTime="2026-03-18 18:20:52.868375334 +0000 UTC m=+1087.436005656" Mar 18 18:20:53 crc kubenswrapper[4830]: I0318 18:20:53.854475 4830 generic.go:334] "Generic (PLEG): container finished" podID="4ff5d5df-234c-4628-bde1-11f3673a22fd" containerID="119d8ed4821b740db22ee1445ab8b58f898477bd08bc5e02c8c7846598245ad2" exitCode=0 Mar 18 18:20:53 crc kubenswrapper[4830]: I0318 18:20:53.854523 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrpjj" event={"ID":"4ff5d5df-234c-4628-bde1-11f3673a22fd","Type":"ContainerDied","Data":"119d8ed4821b740db22ee1445ab8b58f898477bd08bc5e02c8c7846598245ad2"} Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.472687 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.584281 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.584327 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.584369 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kc52\" (UniqueName: \"kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.584393 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.585093 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.585196 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts\") pod \"4ff5d5df-234c-4628-bde1-11f3673a22fd\" (UID: \"4ff5d5df-234c-4628-bde1-11f3673a22fd\") " Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.590840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.591694 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52" (OuterVolumeSpecName: "kube-api-access-2kc52") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "kube-api-access-2kc52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.591826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.604958 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts" (OuterVolumeSpecName: "scripts") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.612481 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.628874 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data" (OuterVolumeSpecName: "config-data") pod "4ff5d5df-234c-4628-bde1-11f3673a22fd" (UID: "4ff5d5df-234c-4628-bde1-11f3673a22fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687097 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687125 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687133 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687143 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687151 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kc52\" (UniqueName: \"kubernetes.io/projected/4ff5d5df-234c-4628-bde1-11f3673a22fd-kube-api-access-2kc52\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.687160 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff5d5df-234c-4628-bde1-11f3673a22fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.881757 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrpjj" event={"ID":"4ff5d5df-234c-4628-bde1-11f3673a22fd","Type":"ContainerDied","Data":"42a7d2cfdc68f53354e6462975b5a7583d5b3b0573163d070ccb29265a16477e"} Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 
18:20:56.881804 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42a7d2cfdc68f53354e6462975b5a7583d5b3b0573163d070ccb29265a16477e" Mar 18 18:20:56 crc kubenswrapper[4830]: I0318 18:20:56.881857 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrpjj" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.550444 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wrpjj"] Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.565336 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wrpjj"] Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.650447 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fknfr"] Mar 18 18:20:57 crc kubenswrapper[4830]: E0318 18:20:57.651177 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff5d5df-234c-4628-bde1-11f3673a22fd" containerName="keystone-bootstrap" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651211 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff5d5df-234c-4628-bde1-11f3673a22fd" containerName="keystone-bootstrap" Mar 18 18:20:57 crc kubenswrapper[4830]: E0318 18:20:57.651239 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="dnsmasq-dns" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651252 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="dnsmasq-dns" Mar 18 18:20:57 crc kubenswrapper[4830]: E0318 18:20:57.651311 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a22953-2e4d-48be-a69a-8593623257d0" containerName="init" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651324 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a22953-2e4d-48be-a69a-8593623257d0" 
containerName="init" Mar 18 18:20:57 crc kubenswrapper[4830]: E0318 18:20:57.651346 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="init" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651359 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="init" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651627 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a22953-2e4d-48be-a69a-8593623257d0" containerName="init" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651665 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="001a299a-877e-4e4b-a2e4-24b82df345b8" containerName="dnsmasq-dns" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.651684 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff5d5df-234c-4628-bde1-11f3673a22fd" containerName="keystone-bootstrap" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.652564 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.656531 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.656593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.656736 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.656849 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.656930 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cg9pq" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.663124 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fknfr"] Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705481 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zdxb4\" (UniqueName: \"kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705732 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.705907 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808033 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxb4\" 
(UniqueName: \"kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808133 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808214 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.808241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.814064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data\") pod \"keystone-bootstrap-fknfr\" (UID: 
\"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.814452 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.816008 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.824052 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.827109 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 18:20:57.827422 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxb4\" (UniqueName: \"kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4\") pod \"keystone-bootstrap-fknfr\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:57 crc kubenswrapper[4830]: I0318 
18:20:57.973306 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:20:58 crc kubenswrapper[4830]: I0318 18:20:58.245541 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff5d5df-234c-4628-bde1-11f3673a22fd" path="/var/lib/kubelet/pods/4ff5d5df-234c-4628-bde1-11f3673a22fd/volumes" Mar 18 18:20:58 crc kubenswrapper[4830]: I0318 18:20:58.803317 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:20:58 crc kubenswrapper[4830]: I0318 18:20:58.867062 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"] Mar 18 18:20:58 crc kubenswrapper[4830]: I0318 18:20:58.867337 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" containerID="cri-o://846a2ec81ed8d83c06ca8e866ae1099194cbad679ba89dee281a5483357a279a" gracePeriod=10 Mar 18 18:21:00 crc kubenswrapper[4830]: I0318 18:21:00.793246 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Mar 18 18:21:00 crc kubenswrapper[4830]: I0318 18:21:00.939553 4830 generic.go:334] "Generic (PLEG): container finished" podID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerID="846a2ec81ed8d83c06ca8e866ae1099194cbad679ba89dee281a5483357a279a" exitCode=0 Mar 18 18:21:00 crc kubenswrapper[4830]: I0318 18:21:00.939596 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" event={"ID":"a93ae87c-c4d2-4dec-af01-3478996b70fc","Type":"ContainerDied","Data":"846a2ec81ed8d83c06ca8e866ae1099194cbad679ba89dee281a5483357a279a"} Mar 18 18:21:05 crc 
kubenswrapper[4830]: I0318 18:21:05.792357 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Mar 18 18:21:05 crc kubenswrapper[4830]: I0318 18:21:05.981573 4830 generic.go:334] "Generic (PLEG): container finished" podID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" containerID="7c99bf884aafec1098e4dad942ec1e611c8c819bbe83815db3fdad19bac4fd8e" exitCode=0 Mar 18 18:21:05 crc kubenswrapper[4830]: I0318 18:21:05.981621 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hxlrh" event={"ID":"e2cad194-a0a7-44e7-8e5c-4653ae33983c","Type":"ContainerDied","Data":"7c99bf884aafec1098e4dad942ec1e611c8c819bbe83815db3fdad19bac4fd8e"} Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.306312 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.306830 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lntz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-96knc_openstack(8c42d089-56c7-45ee-ba54-ee464499ff29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.308006 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-96knc" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.788299 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.788604 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trfdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wvvs5_openstack(40d348ed-98d2-494b-b2b1-f1dfb190a636): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:21:07 crc kubenswrapper[4830]: E0318 18:21:07.789730 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wvvs5" 
podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" Mar 18 18:21:07 crc kubenswrapper[4830]: I0318 18:21:07.984472 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hxlrh" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.030614 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hxlrh" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.030753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hxlrh" event={"ID":"e2cad194-a0a7-44e7-8e5c-4653ae33983c","Type":"ContainerDied","Data":"b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5"} Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.030794 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41163353b6d25a7f3e9839add4632f868ed51420313e36589cafc75778950d5" Mar 18 18:21:08 crc kubenswrapper[4830]: E0318 18:21:08.032343 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-wvvs5" podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" Mar 18 18:21:08 crc kubenswrapper[4830]: E0318 18:21:08.032434 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-96knc" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.138097 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data\") pod \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.138197 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ckns\" (UniqueName: \"kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns\") pod \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.138244 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle\") pod \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.138271 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data\") pod \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\" (UID: \"e2cad194-a0a7-44e7-8e5c-4653ae33983c\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.146054 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e2cad194-a0a7-44e7-8e5c-4653ae33983c" (UID: "e2cad194-a0a7-44e7-8e5c-4653ae33983c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.155408 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns" (OuterVolumeSpecName: "kube-api-access-5ckns") pod "e2cad194-a0a7-44e7-8e5c-4653ae33983c" (UID: "e2cad194-a0a7-44e7-8e5c-4653ae33983c"). InnerVolumeSpecName "kube-api-access-5ckns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.168863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2cad194-a0a7-44e7-8e5c-4653ae33983c" (UID: "e2cad194-a0a7-44e7-8e5c-4653ae33983c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.214161 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.222988 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data" (OuterVolumeSpecName: "config-data") pod "e2cad194-a0a7-44e7-8e5c-4653ae33983c" (UID: "e2cad194-a0a7-44e7-8e5c-4653ae33983c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.240326 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ckns\" (UniqueName: \"kubernetes.io/projected/e2cad194-a0a7-44e7-8e5c-4653ae33983c-kube-api-access-5ckns\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.240354 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.240363 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.240372 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e2cad194-a0a7-44e7-8e5c-4653ae33983c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.342420 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb\") pod \"a93ae87c-c4d2-4dec-af01-3478996b70fc\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.342491 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config\") pod \"a93ae87c-c4d2-4dec-af01-3478996b70fc\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.342563 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb\") pod \"a93ae87c-c4d2-4dec-af01-3478996b70fc\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.342611 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8b4m\" (UniqueName: \"kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m\") pod \"a93ae87c-c4d2-4dec-af01-3478996b70fc\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.342633 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc\") pod \"a93ae87c-c4d2-4dec-af01-3478996b70fc\" (UID: \"a93ae87c-c4d2-4dec-af01-3478996b70fc\") " Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.353436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m" (OuterVolumeSpecName: "kube-api-access-t8b4m") pod "a93ae87c-c4d2-4dec-af01-3478996b70fc" (UID: "a93ae87c-c4d2-4dec-af01-3478996b70fc"). InnerVolumeSpecName "kube-api-access-t8b4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.400955 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fknfr"] Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.415819 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"] Mar 18 18:21:08 crc kubenswrapper[4830]: E0318 18:21:08.416218 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.416229 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" Mar 18 18:21:08 crc kubenswrapper[4830]: E0318 18:21:08.416250 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" containerName="glance-db-sync" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.416256 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" containerName="glance-db-sync" Mar 18 18:21:08 crc kubenswrapper[4830]: E0318 18:21:08.416273 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="init" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.416278 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="init" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.416445 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" containerName="glance-db-sync" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.416464 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" containerName="dnsmasq-dns" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.417279 
4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.431538 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a93ae87c-c4d2-4dec-af01-3478996b70fc" (UID: "a93ae87c-c4d2-4dec-af01-3478996b70fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.438178 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"] Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.445445 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a93ae87c-c4d2-4dec-af01-3478996b70fc" (UID: "a93ae87c-c4d2-4dec-af01-3478996b70fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.445758 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a93ae87c-c4d2-4dec-af01-3478996b70fc" (UID: "a93ae87c-c4d2-4dec-af01-3478996b70fc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.447076 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.447147 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.447163 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8b4m\" (UniqueName: \"kubernetes.io/projected/a93ae87c-c4d2-4dec-af01-3478996b70fc-kube-api-access-t8b4m\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.447172 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.471214 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config" (OuterVolumeSpecName: "config") pod "a93ae87c-c4d2-4dec-af01-3478996b70fc" (UID: "a93ae87c-c4d2-4dec-af01-3478996b70fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.549585 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.549677 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.549748 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vk9\" (UniqueName: \"kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.549864 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.549908 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" 
(UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.550039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.550110 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ae87c-c4d2-4dec-af01-3478996b70fc-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.651602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.651898 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.651928 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.651977 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vk9\" (UniqueName: \"kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.652001 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.652020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.652903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.654335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.654834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.657195 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.657238 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.669631 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vk9\" (UniqueName: \"kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9\") pod \"dnsmasq-dns-5d7fb48775-b79mg\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:08 crc kubenswrapper[4830]: I0318 18:21:08.740887 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.041554 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-msrv5" event={"ID":"8a1597fb-6c27-4b75-8996-40ff17a49e69","Type":"ContainerStarted","Data":"260f46cb5b74d58cda51ed1891b814c7af3b8dc935ea5d26486826dc9676c6a4"} Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.043979 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fknfr" event={"ID":"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf","Type":"ContainerStarted","Data":"b679d58105fc60f57cf5c7272800740c97568fe76f6fe223f1159f845d0f52b1"} Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.044029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fknfr" event={"ID":"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf","Type":"ContainerStarted","Data":"038ac3cf69e85e35e60ab1a254a73215143eb85ae1e7164f060dcfcf2d403bcd"} Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.045054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerStarted","Data":"50c6f76b1d4c7a12f137499a2267b56243ac8396eb3e46a198a94656acaf8ba4"} Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.049954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2" event={"ID":"a93ae87c-c4d2-4dec-af01-3478996b70fc","Type":"ContainerDied","Data":"11abd75f2828fbeab59adc44c628e5b1236435d08218147f26257bab514e3f14"} Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.049997 4830 scope.go:117] "RemoveContainer" containerID="846a2ec81ed8d83c06ca8e866ae1099194cbad679ba89dee281a5483357a279a" Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.050131 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.053974 4830 generic.go:334] "Generic (PLEG): container finished" podID="82570719-0e07-4ef2-adee-a287052cc4ac" containerID="7c515d7ae75f9681256fdbe1f69965eb9c3684f823264317db51e5df64736fcd" exitCode=0
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.054013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgb8c" event={"ID":"82570719-0e07-4ef2-adee-a287052cc4ac","Type":"ContainerDied","Data":"7c515d7ae75f9681256fdbe1f69965eb9c3684f823264317db51e5df64736fcd"}
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.074589 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-msrv5" podStartSLOduration=2.64798944 podStartE2EDuration="21.07457336s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="2026-03-18 18:20:49.369889168 +0000 UTC m=+1083.937519500" lastFinishedPulling="2026-03-18 18:21:07.796473058 +0000 UTC m=+1102.364103420" observedRunningTime="2026-03-18 18:21:09.06565693 +0000 UTC m=+1103.633287262" watchObservedRunningTime="2026-03-18 18:21:09.07457336 +0000 UTC m=+1103.642203692"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.088489 4830 scope.go:117] "RemoveContainer" containerID="ca0e0174fbb956aab4d64c5335a5c9127ee141ff9977768dc8b0aac8c5436c5d"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.091848 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.098567 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-4q6g2"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.122344 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fknfr" podStartSLOduration=12.122328483 podStartE2EDuration="12.122328483s" podCreationTimestamp="2026-03-18 18:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:09.116756216 +0000 UTC m=+1103.684386548" watchObservedRunningTime="2026-03-18 18:21:09.122328483 +0000 UTC m=+1103.689958815"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.167692 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"]
Mar 18 18:21:09 crc kubenswrapper[4830]: W0318 18:21:09.176033 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb236dbaf_0244_477f_b8fe_4d4a1a9f9fa5.slice/crio-9652c9e048cd3d4b725e92aef386e60f90932970e2132f85a8a2f50e0abaf821 WatchSource:0}: Error finding container 9652c9e048cd3d4b725e92aef386e60f90932970e2132f85a8a2f50e0abaf821: Status 404 returned error can't find the container with id 9652c9e048cd3d4b725e92aef386e60f90932970e2132f85a8a2f50e0abaf821
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.298036 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.299413 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.304716 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.332202 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.332368 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.333256 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-x4gbn"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367149 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2mk7\" (UniqueName: \"kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367267 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367306 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.367369 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2mk7\" (UniqueName: \"kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469186 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469214 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469244 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469273 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469686 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.469788 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.470026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.472728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.473854 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.478479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.478647 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.487308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2mk7\" (UniqueName: \"kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.499016 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.590031 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.591306 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.593422 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.619022 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.648074 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676082 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plrx4\" (UniqueName: \"kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676256 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676274 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.676298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.776963 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plrx4\" (UniqueName: \"kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777252 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777473 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.777670 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.779262 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.779599 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.784079 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.785037 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.789145 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.800283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plrx4\" (UniqueName: \"kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.825994 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:09 crc kubenswrapper[4830]: I0318 18:21:09.913494 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.067220 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerStarted","Data":"47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487"}
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.071011 4830 generic.go:334] "Generic (PLEG): container finished" podID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerID="e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03" exitCode=0
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.071107 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" event={"ID":"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5","Type":"ContainerDied","Data":"e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03"}
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.071137 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" event={"ID":"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5","Type":"ContainerStarted","Data":"9652c9e048cd3d4b725e92aef386e60f90932970e2132f85a8a2f50e0abaf821"}
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.244733 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93ae87c-c4d2-4dec-af01-3478996b70fc" path="/var/lib/kubelet/pods/a93ae87c-c4d2-4dec-af01-3478996b70fc/volumes"
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.245483 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:10 crc kubenswrapper[4830]: W0318 18:21:10.275399 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6bfa626_3689_4791_94ce_6a2b2d80c8ea.slice/crio-a018c6afcb8ab00f13e9f0f0075b6400d1d6c158342608026a738b2f9d317ade WatchSource:0}: Error finding container a018c6afcb8ab00f13e9f0f0075b6400d1d6c158342608026a738b2f9d317ade: Status 404 returned error can't find the container with id a018c6afcb8ab00f13e9f0f0075b6400d1d6c158342608026a738b2f9d317ade
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.427552 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.443420 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.590297 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zc4\" (UniqueName: \"kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4\") pod \"82570719-0e07-4ef2-adee-a287052cc4ac\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") "
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.590421 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle\") pod \"82570719-0e07-4ef2-adee-a287052cc4ac\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") "
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.590451 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config\") pod \"82570719-0e07-4ef2-adee-a287052cc4ac\" (UID: \"82570719-0e07-4ef2-adee-a287052cc4ac\") "
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.598249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4" (OuterVolumeSpecName: "kube-api-access-d4zc4") pod "82570719-0e07-4ef2-adee-a287052cc4ac" (UID: "82570719-0e07-4ef2-adee-a287052cc4ac"). InnerVolumeSpecName "kube-api-access-d4zc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.630311 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82570719-0e07-4ef2-adee-a287052cc4ac" (UID: "82570719-0e07-4ef2-adee-a287052cc4ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.633926 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config" (OuterVolumeSpecName: "config") pod "82570719-0e07-4ef2-adee-a287052cc4ac" (UID: "82570719-0e07-4ef2-adee-a287052cc4ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.692342 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zc4\" (UniqueName: \"kubernetes.io/projected/82570719-0e07-4ef2-adee-a287052cc4ac-kube-api-access-d4zc4\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.692376 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:10 crc kubenswrapper[4830]: I0318 18:21:10.692385 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/82570719-0e07-4ef2-adee-a287052cc4ac-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.097705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerStarted","Data":"fbc816e6461aeac29a4e75cd3f1da74b2e66051383b867030c291ef7221d8b46"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.105172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgb8c" event={"ID":"82570719-0e07-4ef2-adee-a287052cc4ac","Type":"ContainerDied","Data":"7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.105216 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vgb8c"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.105224 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7148e7d1dd170bfd1bff0716804026bebfe1db3b8c1e8cd06be112035507aa42"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.116118 4830 generic.go:334] "Generic (PLEG): container finished" podID="8a1597fb-6c27-4b75-8996-40ff17a49e69" containerID="260f46cb5b74d58cda51ed1891b814c7af3b8dc935ea5d26486826dc9676c6a4" exitCode=0
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.116180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-msrv5" event={"ID":"8a1597fb-6c27-4b75-8996-40ff17a49e69","Type":"ContainerDied","Data":"260f46cb5b74d58cda51ed1891b814c7af3b8dc935ea5d26486826dc9676c6a4"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.121862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerStarted","Data":"022ff1d7a9b527412e90df5149ce8cf2717dfc54359e6e8dbad446e574b96a65"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.121890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerStarted","Data":"a018c6afcb8ab00f13e9f0f0075b6400d1d6c158342608026a738b2f9d317ade"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.127599 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" event={"ID":"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5","Type":"ContainerStarted","Data":"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"}
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.127961 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.205097 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" podStartSLOduration=3.205064048 podStartE2EDuration="3.205064048s" podCreationTimestamp="2026-03-18 18:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:11.195488449 +0000 UTC m=+1105.763118781" watchObservedRunningTime="2026-03-18 18:21:11.205064048 +0000 UTC m=+1105.772694380"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.338000 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"]
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.382285 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"]
Mar 18 18:21:11 crc kubenswrapper[4830]: E0318 18:21:11.382698 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82570719-0e07-4ef2-adee-a287052cc4ac" containerName="neutron-db-sync"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.382709 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="82570719-0e07-4ef2-adee-a287052cc4ac" containerName="neutron-db-sync"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.382871 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="82570719-0e07-4ef2-adee-a287052cc4ac" containerName="neutron-db-sync"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.383704 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.389303 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sjj5h"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.389510 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.389672 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.389789 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.390355 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"]
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.409780 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"]
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.409952 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.409979 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.410023 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.410059 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb24t\" (UniqueName: \"kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.410159 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.412940 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.413891 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"]
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.511506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.512011 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.512050 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.512104 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.512132 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb24t\" (UniqueName: \"kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.517467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.520201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.520348 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.520857 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.532843 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb24t\" (UniqueName: \"kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t\") pod \"neutron-75b6876cd4-frf5t\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " pod="openstack/neutron-75b6876cd4-frf5t"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.547814 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613434 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqw4\" (UniqueName: \"kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613456 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613527 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm"
Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613561 4830 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.613587 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.628250 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.715440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.715489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqw4\" (UniqueName: \"kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.716501 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " 
pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.715513 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.716598 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.716703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.716745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.716786 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc 
kubenswrapper[4830]: I0318 18:21:11.717419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.718340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.718731 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.730242 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b6876cd4-frf5t" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.732912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqw4\" (UniqueName: \"kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4\") pod \"dnsmasq-dns-5d8b7b7d5-kjddm\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:11 crc kubenswrapper[4830]: I0318 18:21:11.734691 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.156682 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerStarted","Data":"624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9"} Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.157377 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-log" containerID="cri-o://022ff1d7a9b527412e90df5149ce8cf2717dfc54359e6e8dbad446e574b96a65" gracePeriod=30 Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.158103 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-httpd" containerID="cri-o://624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9" gracePeriod=30 Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.163203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerStarted","Data":"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"} Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.170726 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" containerID="b679d58105fc60f57cf5c7272800740c97568fe76f6fe223f1159f845d0f52b1" exitCode=0 Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.170957 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fknfr" event={"ID":"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf","Type":"ContainerDied","Data":"b679d58105fc60f57cf5c7272800740c97568fe76f6fe223f1159f845d0f52b1"} Mar 18 
18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.219292 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.219269212 podStartE2EDuration="4.219269212s" podCreationTimestamp="2026-03-18 18:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:12.216029901 +0000 UTC m=+1106.783660233" watchObservedRunningTime="2026-03-18 18:21:12.219269212 +0000 UTC m=+1106.786899544" Mar 18 18:21:12 crc kubenswrapper[4830]: W0318 18:21:12.409485 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56705b88_b409_4811_a999_f548a0f108c7.slice/crio-05109e7091ef81f157aeed03c072a59d2e8765e3a804c5c8316ed9e69ffcd911 WatchSource:0}: Error finding container 05109e7091ef81f157aeed03c072a59d2e8765e3a804c5c8316ed9e69ffcd911: Status 404 returned error can't find the container with id 05109e7091ef81f157aeed03c072a59d2e8765e3a804c5c8316ed9e69ffcd911 Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.410109 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"] Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.637658 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"] Mar 18 18:21:12 crc kubenswrapper[4830]: W0318 18:21:12.644358 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eaa5e65_715d_45fd_9ce9_cb2c1adcf283.slice/crio-b397e3843d9fe6905ef94a24c04b49242d68c5037d106a7914625d142c38dc3b WatchSource:0}: Error finding container b397e3843d9fe6905ef94a24c04b49242d68c5037d106a7914625d142c38dc3b: Status 404 returned error can't find the container with id b397e3843d9fe6905ef94a24c04b49242d68c5037d106a7914625d142c38dc3b Mar 18 18:21:12 crc 
kubenswrapper[4830]: I0318 18:21:12.665130 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-msrv5" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751226 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2p2q\" (UniqueName: \"kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q\") pod \"8a1597fb-6c27-4b75-8996-40ff17a49e69\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751292 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs\") pod \"8a1597fb-6c27-4b75-8996-40ff17a49e69\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751322 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle\") pod \"8a1597fb-6c27-4b75-8996-40ff17a49e69\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751591 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data\") pod \"8a1597fb-6c27-4b75-8996-40ff17a49e69\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751649 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts\") pod \"8a1597fb-6c27-4b75-8996-40ff17a49e69\" (UID: \"8a1597fb-6c27-4b75-8996-40ff17a49e69\") " Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.751718 4830 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs" (OuterVolumeSpecName: "logs") pod "8a1597fb-6c27-4b75-8996-40ff17a49e69" (UID: "8a1597fb-6c27-4b75-8996-40ff17a49e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.752360 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1597fb-6c27-4b75-8996-40ff17a49e69-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.758152 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts" (OuterVolumeSpecName: "scripts") pod "8a1597fb-6c27-4b75-8996-40ff17a49e69" (UID: "8a1597fb-6c27-4b75-8996-40ff17a49e69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.761904 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q" (OuterVolumeSpecName: "kube-api-access-z2p2q") pod "8a1597fb-6c27-4b75-8996-40ff17a49e69" (UID: "8a1597fb-6c27-4b75-8996-40ff17a49e69"). InnerVolumeSpecName "kube-api-access-z2p2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.786910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data" (OuterVolumeSpecName: "config-data") pod "8a1597fb-6c27-4b75-8996-40ff17a49e69" (UID: "8a1597fb-6c27-4b75-8996-40ff17a49e69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.840280 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a1597fb-6c27-4b75-8996-40ff17a49e69" (UID: "8a1597fb-6c27-4b75-8996-40ff17a49e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.853430 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.853458 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.853488 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2p2q\" (UniqueName: \"kubernetes.io/projected/8a1597fb-6c27-4b75-8996-40ff17a49e69-kube-api-access-z2p2q\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:12 crc kubenswrapper[4830]: I0318 18:21:12.853499 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1597fb-6c27-4b75-8996-40ff17a49e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:12 crc kubenswrapper[4830]: E0318 18:21:12.951155 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6bfa626_3689_4791_94ce_6a2b2d80c8ea.slice/crio-624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:21:13 
crc kubenswrapper[4830]: I0318 18:21:13.216688 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerStarted","Data":"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.216948 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-log" containerID="cri-o://1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7" gracePeriod=30 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.220298 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-httpd" containerID="cri-o://81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34" gracePeriod=30 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.222047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-msrv5" event={"ID":"8a1597fb-6c27-4b75-8996-40ff17a49e69","Type":"ContainerDied","Data":"e26967556c1953b30de9bf4c96e8d2a1178cff18d47f97516db68d3b6692573b"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.222292 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e26967556c1953b30de9bf4c96e8d2a1178cff18d47f97516db68d3b6692573b" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.222369 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-msrv5" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.227933 4830 generic.go:334] "Generic (PLEG): container finished" podID="56705b88-b409-4811-a999-f548a0f108c7" containerID="f21019a22eb84ca940826b9de78b1fbbaa1ea01b137efc55759b6a2d08e34568" exitCode=0 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.228034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" event={"ID":"56705b88-b409-4811-a999-f548a0f108c7","Type":"ContainerDied","Data":"f21019a22eb84ca940826b9de78b1fbbaa1ea01b137efc55759b6a2d08e34568"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.228061 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" event={"ID":"56705b88-b409-4811-a999-f548a0f108c7","Type":"ContainerStarted","Data":"05109e7091ef81f157aeed03c072a59d2e8765e3a804c5c8316ed9e69ffcd911"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.247015 4830 generic.go:334] "Generic (PLEG): container finished" podID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerID="624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9" exitCode=0 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.247051 4830 generic.go:334] "Generic (PLEG): container finished" podID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerID="022ff1d7a9b527412e90df5149ce8cf2717dfc54359e6e8dbad446e574b96a65" exitCode=143 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.247108 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerDied","Data":"624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.247138 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerDied","Data":"022ff1d7a9b527412e90df5149ce8cf2717dfc54359e6e8dbad446e574b96a65"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.260230 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.260212636 podStartE2EDuration="5.260212636s" podCreationTimestamp="2026-03-18 18:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:13.250491813 +0000 UTC m=+1107.818122155" watchObservedRunningTime="2026-03-18 18:21:13.260212636 +0000 UTC m=+1107.827842968" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.264840 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="dnsmasq-dns" containerID="cri-o://3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb" gracePeriod=10 Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.265204 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerStarted","Data":"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.265247 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerStarted","Data":"b397e3843d9fe6905ef94a24c04b49242d68c5037d106a7914625d142c38dc3b"} Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.335948 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-788c577778-lg8kp"] Mar 18 18:21:13 crc kubenswrapper[4830]: E0318 18:21:13.336403 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a1597fb-6c27-4b75-8996-40ff17a49e69" containerName="placement-db-sync" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.336425 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a1597fb-6c27-4b75-8996-40ff17a49e69" containerName="placement-db-sync" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.336623 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a1597fb-6c27-4b75-8996-40ff17a49e69" containerName="placement-db-sync" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.339239 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.340869 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.342409 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.342712 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bq9d5" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.342991 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.343224 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.370984 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371060 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwjb\" (UniqueName: \"kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371095 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371225 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371249 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.371293 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.376855 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788c577778-lg8kp"] Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.390084 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473291 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473338 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473371 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473422 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: 
\"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473462 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473578 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473635 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2mk7\" (UniqueName: \"kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7\") pod \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\" (UID: \"c6bfa626-3689-4791-94ce-6a2b2d80c8ea\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwjb\" (UniqueName: \"kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473902 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473960 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.473994 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.474010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.474069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.477804 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.478046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.478309 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs" (OuterVolumeSpecName: "logs") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.478935 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7" (OuterVolumeSpecName: "kube-api-access-b2mk7") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "kube-api-access-b2mk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.479808 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.498223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.499486 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.501584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts" (OuterVolumeSpecName: "scripts") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.501901 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.502428 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.520888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwjb\" (UniqueName: \"kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.526268 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts\") pod \"placement-788c577778-lg8kp\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.552034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575797 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575826 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575836 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2mk7\" (UniqueName: \"kubernetes.io/projected/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-kube-api-access-b2mk7\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575845 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575872 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.575899 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.584406 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data" (OuterVolumeSpecName: "config-data") pod "c6bfa626-3689-4791-94ce-6a2b2d80c8ea" (UID: "c6bfa626-3689-4791-94ce-6a2b2d80c8ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.604920 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.617392 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fknfr" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.676840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.676971 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677015 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdxb4\" (UniqueName: \"kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677040 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677091 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys\") pod \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\" (UID: \"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf\") " Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677590 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.677606 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfa626-3689-4791-94ce-6a2b2d80c8ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.685929 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4" (OuterVolumeSpecName: "kube-api-access-zdxb4") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "kube-api-access-zdxb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.687506 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts" (OuterVolumeSpecName: "scripts") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.692103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.692647 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.714822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data" (OuterVolumeSpecName: "config-data") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.718134 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.722752 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" (UID: "4d3ffcbf-d066-4c5f-bf95-8503bcb983cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801729 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801835 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801847 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdxb4\" (UniqueName: \"kubernetes.io/projected/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-kube-api-access-zdxb4\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801860 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801870 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.801900 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:13 crc kubenswrapper[4830]: I0318 18:21:13.954137 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005232 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7vk9\" (UniqueName: \"kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005313 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005355 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005486 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005551 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.005652 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb\") pod \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\" (UID: \"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5\") " Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.029970 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9" (OuterVolumeSpecName: "kube-api-access-h7vk9") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "kube-api-access-h7vk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.108625 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7vk9\" (UniqueName: \"kubernetes.io/projected/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-kube-api-access-h7vk9\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.120918 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.121205 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.124899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.130524 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config" (OuterVolumeSpecName: "config") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.142617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" (UID: "b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.204968 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.209824 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.209844 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.209853 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.209865 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.209874 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.258027 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788c577778-lg8kp"] Mar 18 18:21:14 crc kubenswrapper[4830]: W0318 18:21:14.258302 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cea3ec6_0f57_4c56_92fb_5dc40de81fe4.slice/crio-f550bc3b9d453218b2c6339b04d35bb1b92975b54fa0f5a9f503e69992114125 WatchSource:0}: Error finding container f550bc3b9d453218b2c6339b04d35bb1b92975b54fa0f5a9f503e69992114125: Status 404 returned error 
can't find the container with id f550bc3b9d453218b2c6339b04d35bb1b92975b54fa0f5a9f503e69992114125 Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.294676 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" event={"ID":"56705b88-b409-4811-a999-f548a0f108c7","Type":"ContainerStarted","Data":"023f57ad1843e4c93604c976b9f7c86bb24ad0a5d2f1dfeb4f8efe8a9f50ae62"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.294764 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.301233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6bfa626-3689-4791-94ce-6a2b2d80c8ea","Type":"ContainerDied","Data":"a018c6afcb8ab00f13e9f0f0075b6400d1d6c158342608026a738b2f9d317ade"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.301311 4830 scope.go:117] "RemoveContainer" containerID="624b401ca5bcdfc97e63f6546c2257324183f36d820c44fabc3eeabef676ccf9" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.301488 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.313612 4830 generic.go:334] "Generic (PLEG): container finished" podID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerID="3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb" exitCode=0 Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.313926 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" event={"ID":"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5","Type":"ContainerDied","Data":"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.314036 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" event={"ID":"b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5","Type":"ContainerDied","Data":"9652c9e048cd3d4b725e92aef386e60f90932970e2132f85a8a2f50e0abaf821"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.314175 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-b79mg" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.326170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerStarted","Data":"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.326241 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75b6876cd4-frf5t" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.327832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerStarted","Data":"f550bc3b9d453218b2c6339b04d35bb1b92975b54fa0f5a9f503e69992114125"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329456 4830 generic.go:334] "Generic (PLEG): container finished" podID="b46d602e-2976-4d06-80f1-c592927b2415" containerID="81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34" exitCode=0 Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329476 4830 generic.go:334] "Generic (PLEG): container finished" podID="b46d602e-2976-4d06-80f1-c592927b2415" containerID="1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7" exitCode=143 Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329504 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerDied","Data":"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329519 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerDied","Data":"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329529 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b46d602e-2976-4d06-80f1-c592927b2415","Type":"ContainerDied","Data":"fbc816e6461aeac29a4e75cd3f1da74b2e66051383b867030c291ef7221d8b46"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.329575 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.334292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fknfr" event={"ID":"4d3ffcbf-d066-4c5f-bf95-8503bcb983cf","Type":"ContainerDied","Data":"038ac3cf69e85e35e60ab1a254a73215143eb85ae1e7164f060dcfcf2d403bcd"} Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.334322 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="038ac3cf69e85e35e60ab1a254a73215143eb85ae1e7164f060dcfcf2d403bcd" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.334368 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fknfr"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342093 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342241 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342312 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342359 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plrx4\" (UniqueName: \"kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342468 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data\") pod \"b46d602e-2976-4d06-80f1-c592927b2415\" (UID: \"b46d602e-2976-4d06-80f1-c592927b2415\") "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs" (OuterVolumeSpecName: "logs") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.342789 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.364555 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.364590 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b46d602e-2976-4d06-80f1-c592927b2415-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.372459 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.373819 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts" (OuterVolumeSpecName: "scripts") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.374198 4830 scope.go:117] "RemoveContainer" containerID="022ff1d7a9b527412e90df5149ce8cf2717dfc54359e6e8dbad446e574b96a65"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.376949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4" (OuterVolumeSpecName: "kube-api-access-plrx4") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "kube-api-access-plrx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.391644 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" podStartSLOduration=3.391619845 podStartE2EDuration="3.391619845s" podCreationTimestamp="2026-03-18 18:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:14.31602062 +0000 UTC m=+1108.883650952" watchObservedRunningTime="2026-03-18 18:21:14.391619845 +0000 UTC m=+1108.959250177"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.435954 4830 scope.go:117] "RemoveContainer" containerID="3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.436188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.445699 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.459262 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.466935 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plrx4\" (UniqueName: \"kubernetes.io/projected/b46d602e-2976-4d06-80f1-c592927b2415-kube-api-access-plrx4\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.466961 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.466980 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.466990 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.467012 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data" (OuterVolumeSpecName: "config-data") pod "b46d602e-2976-4d06-80f1-c592927b2415" (UID: "b46d602e-2976-4d06-80f1-c592927b2415"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.480067 4830 scope.go:117] "RemoveContainer" containerID="e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481340 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"]
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481676 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" containerName="keystone-bootstrap"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481693 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" containerName="keystone-bootstrap"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481707 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481714 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481723 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="dnsmasq-dns"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481728 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="dnsmasq-dns"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481736 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="init"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481741 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="init"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481753 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481759 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481858 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481866 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.481883 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.481889 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482084 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482105 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" containerName="dnsmasq-dns"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482115 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482126 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-log"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482137 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" containerName="keystone-bootstrap"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482146 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46d602e-2976-4d06-80f1-c592927b2415" containerName="glance-httpd"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482471 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.482709 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.488174 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.488346 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.488463 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cg9pq"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.488674 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.488760 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.489108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.495246 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.497015 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.505051 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.506457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.508343 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.509415 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75b6876cd4-frf5t" podStartSLOduration=3.509397036 podStartE2EDuration="3.509397036s" podCreationTimestamp="2026-03-18 18:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:14.399485176 +0000 UTC m=+1108.967115508" watchObservedRunningTime="2026-03-18 18:21:14.509397036 +0000 UTC m=+1109.077027368"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.512667 4830 scope.go:117] "RemoveContainer" containerID="3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.514287 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb\": container with ID starting with 3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb not found: ID does not exist" containerID="3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.514333 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb"} err="failed to get container status \"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb\": rpc error: code = NotFound desc = could not find container \"3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb\": container with ID starting with 3dcfe83386a1d36d615998d6b067020035e5f84277f665f2061db5c5b0db8ddb not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.514359 4830 scope.go:117] "RemoveContainer" containerID="e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.514644 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03\": container with ID starting with e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03 not found: ID does not exist" containerID="e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.514681 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03"} err="failed to get container status \"e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03\": rpc error: code = NotFound desc = could not find container \"e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03\": container with ID starting with e62a562e4983b13306a458d2cdd9835f57c225b4e34240d054ef2fbd00706d03 not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.514707 4830 scope.go:117] "RemoveContainer" containerID="81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.528351 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.535613 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.541244 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-b79mg"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.567938 4830 scope.go:117] "RemoveContainer" containerID="1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568046 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568339 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568576 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568645 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfmg\" (UniqueName: \"kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568762 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568895 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46d602e-2976-4d06-80f1-c592927b2415-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.568915 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.605442 4830 scope.go:117] "RemoveContainer" containerID="81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.607546 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34\": container with ID starting with 81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34 not found: ID does not exist" containerID="81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.607571 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"} err="failed to get container status \"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34\": rpc error: code = NotFound desc = could not find container \"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34\": container with ID starting with 81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34 not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.607591 4830 scope.go:117] "RemoveContainer" containerID="1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"
Mar 18 18:21:14 crc kubenswrapper[4830]: E0318 18:21:14.608028 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7\": container with ID starting with 1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7 not found: ID does not exist" containerID="1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.608070 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"} err="failed to get container status \"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7\": rpc error: code = NotFound desc = could not find container \"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7\": container with ID starting with 1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7 not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.608100 4830 scope.go:117] "RemoveContainer" containerID="81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.608570 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34"} err="failed to get container status \"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34\": rpc error: code = NotFound desc = could not find container \"81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34\": container with ID starting with 81e3c473c97b12796cde98e7f665cfcac3af3794916d93f4e4a7948fe1bb2a34 not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.608620 4830 scope.go:117] "RemoveContainer" containerID="1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.609063 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7"} err="failed to get container status \"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7\": rpc error: code = NotFound desc = could not find container \"1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7\": container with ID starting with 1cb25310d93e243172f01fa350a3db71603f3f63d0fc5281b88ec880b15f1eb7 not found: ID does not exist"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.664054 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.669942 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.669994 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zskfq\" (UniqueName: \"kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670012 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670183 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670239 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670286 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfmg\" (UniqueName: \"kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670388 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670439 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670508 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670623 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670800 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670831 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.670866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.674258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.676378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.683013 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.684246 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.689405 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.689842 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.689945 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.690691 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.702393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfmg\" (UniqueName: \"kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg\") pod \"keystone-6f6ff8b5bf-p5xgc\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " pod="openstack/keystone-6f6ff8b5bf-p5xgc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.705967 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.708096 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.723583 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.723900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.724293 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.771979 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772055 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0"
Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\"
(UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772155 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zskfq\" (UniqueName: \"kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772233 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.772277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.773278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.773326 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.773398 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.778179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.778956 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.780361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.781755 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.787321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zskfq\" (UniqueName: \"kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.799439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") " pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.824825 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f6ff8b5bf-p5xgc" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.836560 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874810 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsb6\" (UniqueName: \"kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874942 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874979 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.874995 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.875019 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976286 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zsb6\" (UniqueName: \"kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6\") pod 
\"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976559 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976592 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976609 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976660 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976879 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.976994 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.977456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.984522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 
18:21:14 crc kubenswrapper[4830]: I0318 18:21:14.994138 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.015696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.021910 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.024181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.030652 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zsb6\" (UniqueName: \"kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6\") pod \"glance-default-internal-api-0\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.176421 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-85cbc86c69-bkfst"] Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.177836 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.185005 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.185682 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.187851 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85cbc86c69-bkfst"] Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.282951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283027 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283108 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgnp\" (UniqueName: \"kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283158 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.283226 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.324187 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.366829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.374465 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerStarted","Data":"c528121726764a5f97954024e587f792b928d1ed21a311edd7665687664f45be"} Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.374498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerStarted","Data":"bcdbca28adb2b677a3b5cdbe0dc2125d47ea6cf58677c075e7a03027a4601543"} Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.374887 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.374947 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.385946 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.386015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 
18:21:15.386076 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.386128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgnp\" (UniqueName: \"kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.386219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.386243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.386301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.393425 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.394214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.395492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.396049 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.397528 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.406119 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-788c577778-lg8kp" podStartSLOduration=2.406098567 podStartE2EDuration="2.406098567s" 
podCreationTimestamp="2026-03-18 18:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:15.402111575 +0000 UTC m=+1109.969741907" watchObservedRunningTime="2026-03-18 18:21:15.406098567 +0000 UTC m=+1109.973728899" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.417419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.418371 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgnp\" (UniqueName: \"kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp\") pod \"neutron-85cbc86c69-bkfst\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.454885 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"] Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.496360 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:15 crc kubenswrapper[4830]: I0318 18:21:15.945501 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:21:15 crc kubenswrapper[4830]: W0318 18:21:15.962456 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee116fa_07c8_44cf_b7b9_8dd248a32d82.slice/crio-28dbbaecfe46ada24a64776bfaff890b440b4d7fc6f29f59cf8e5ee8e83507a8 WatchSource:0}: Error finding container 28dbbaecfe46ada24a64776bfaff890b440b4d7fc6f29f59cf8e5ee8e83507a8: Status 404 returned error can't find the container with id 28dbbaecfe46ada24a64776bfaff890b440b4d7fc6f29f59cf8e5ee8e83507a8 Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.116502 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85cbc86c69-bkfst"] Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.246061 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5" path="/var/lib/kubelet/pods/b236dbaf-0244-477f-b8fe-4d4a1a9f9fa5/volumes" Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.247132 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46d602e-2976-4d06-80f1-c592927b2415" path="/var/lib/kubelet/pods/b46d602e-2976-4d06-80f1-c592927b2415/volumes" Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.247847 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bfa626-3689-4791-94ce-6a2b2d80c8ea" path="/var/lib/kubelet/pods/c6bfa626-3689-4791-94ce-6a2b2d80c8ea/volumes" Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.381882 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerStarted","Data":"28dbbaecfe46ada24a64776bfaff890b440b4d7fc6f29f59cf8e5ee8e83507a8"} 
Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.382971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f6ff8b5bf-p5xgc" event={"ID":"4ce021de-a1a0-43a6-a2fa-270ea1238bac","Type":"ContainerStarted","Data":"110a5ec4f5768120e797dad14d9e9cdc7b4dca85aab727001f861f2b68696081"} Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.384170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerStarted","Data":"550094753c0c72a83f4a09c9aab80db49d41471f4c5ea0cc699a845e44d93dde"} Mar 18 18:21:16 crc kubenswrapper[4830]: I0318 18:21:16.386083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerStarted","Data":"8924caab6e84c4053d4b26a930837897ee672254fd2d03647d16b9a320dd4c1a"} Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.403349 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerStarted","Data":"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e"} Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.408191 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f6ff8b5bf-p5xgc" event={"ID":"4ce021de-a1a0-43a6-a2fa-270ea1238bac","Type":"ContainerStarted","Data":"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7"} Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.408870 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f6ff8b5bf-p5xgc" Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.415226 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" 
event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerStarted","Data":"22ea3fa0cc5c2b7b61047286d5c724a062a16f2a3599d4207776fd36457bdcd2"} Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.423925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerStarted","Data":"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"} Mar 18 18:21:17 crc kubenswrapper[4830]: I0318 18:21:17.446895 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f6ff8b5bf-p5xgc" podStartSLOduration=3.44686721 podStartE2EDuration="3.44686721s" podCreationTimestamp="2026-03-18 18:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:17.432799725 +0000 UTC m=+1112.000430077" watchObservedRunningTime="2026-03-18 18:21:17.44686721 +0000 UTC m=+1112.014497562" Mar 18 18:21:19 crc kubenswrapper[4830]: I0318 18:21:19.440313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerStarted","Data":"2cdcb9ee439266520f74d448b0617ce7209026290de151d3b384a0c54cc23c3f"} Mar 18 18:21:19 crc kubenswrapper[4830]: I0318 18:21:19.441185 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:19 crc kubenswrapper[4830]: I0318 18:21:19.443319 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerStarted","Data":"e71f3e55d1b20dc8557a690ff642e06691b7429714e7ad3a2835647f38bc0b9c"} Mar 18 18:21:19 crc kubenswrapper[4830]: I0318 18:21:19.461579 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85cbc86c69-bkfst" 
podStartSLOduration=4.461557062 podStartE2EDuration="4.461557062s" podCreationTimestamp="2026-03-18 18:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:19.456609773 +0000 UTC m=+1114.024240095" watchObservedRunningTime="2026-03-18 18:21:19.461557062 +0000 UTC m=+1114.029187394" Mar 18 18:21:20 crc kubenswrapper[4830]: I0318 18:21:20.457918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerStarted","Data":"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79"} Mar 18 18:21:20 crc kubenswrapper[4830]: I0318 18:21:20.461740 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerStarted","Data":"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"} Mar 18 18:21:20 crc kubenswrapper[4830]: I0318 18:21:20.489681 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.489662406 podStartE2EDuration="6.489662406s" podCreationTimestamp="2026-03-18 18:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:20.483688768 +0000 UTC m=+1115.051319140" watchObservedRunningTime="2026-03-18 18:21:20.489662406 +0000 UTC m=+1115.057292738" Mar 18 18:21:20 crc kubenswrapper[4830]: I0318 18:21:20.509916 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.509885794 podStartE2EDuration="6.509885794s" podCreationTimestamp="2026-03-18 18:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 18:21:20.503490535 +0000 UTC m=+1115.071120887" watchObservedRunningTime="2026-03-18 18:21:20.509885794 +0000 UTC m=+1115.077516126" Mar 18 18:21:21 crc kubenswrapper[4830]: I0318 18:21:21.474667 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvvs5" event={"ID":"40d348ed-98d2-494b-b2b1-f1dfb190a636","Type":"ContainerStarted","Data":"169735e5b77a4533222c6bab6fdb49feec714327883d4fd25c5a424ede227b9d"} Mar 18 18:21:21 crc kubenswrapper[4830]: I0318 18:21:21.505993 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wvvs5" podStartSLOduration=2.688150535 podStartE2EDuration="33.505968019s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="2026-03-18 18:20:49.486977294 +0000 UTC m=+1084.054607636" lastFinishedPulling="2026-03-18 18:21:20.304794778 +0000 UTC m=+1114.872425120" observedRunningTime="2026-03-18 18:21:21.49855841 +0000 UTC m=+1116.066188752" watchObservedRunningTime="2026-03-18 18:21:21.505968019 +0000 UTC m=+1116.073598381" Mar 18 18:21:21 crc kubenswrapper[4830]: I0318 18:21:21.736879 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:21 crc kubenswrapper[4830]: I0318 18:21:21.836625 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"] Mar 18 18:21:21 crc kubenswrapper[4830]: I0318 18:21:21.837223 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="dnsmasq-dns" containerID="cri-o://962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8" gracePeriod=10 Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.378538 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.483516 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerID="962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8" exitCode=0 Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.483557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" event={"ID":"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6","Type":"ContainerDied","Data":"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8"} Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.483567 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.483580 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff6d84665-njgzs" event={"ID":"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6","Type":"ContainerDied","Data":"5bddc21daf1dfae2fcab84f96e1cdd80870e578df077ea97541132ef044958a0"} Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.483596 4830 scope.go:117] "RemoveContainer" containerID="962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.509370 4830 scope.go:117] "RemoveContainer" containerID="6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.530075 4830 scope.go:117] "RemoveContainer" containerID="962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8" Mar 18 18:21:22 crc kubenswrapper[4830]: E0318 18:21:22.531811 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8\": container with ID starting with 
962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8 not found: ID does not exist" containerID="962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.531841 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8"} err="failed to get container status \"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8\": rpc error: code = NotFound desc = could not find container \"962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8\": container with ID starting with 962947d8482075c926d7b1628195a9b909626dd7f9b57ff077cac44de5e256a8 not found: ID does not exist" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.531861 4830 scope.go:117] "RemoveContainer" containerID="6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39" Mar 18 18:21:22 crc kubenswrapper[4830]: E0318 18:21:22.532151 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39\": container with ID starting with 6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39 not found: ID does not exist" containerID="6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.532175 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39"} err="failed to get container status \"6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39\": rpc error: code = NotFound desc = could not find container \"6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39\": container with ID starting with 6d101ca18743304a5c30e54671bb9b642ddbd61577cbc01b12b1a1bddf2aea39 not found: ID does not 
exist" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538671 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538736 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538794 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538929 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf249\" (UniqueName: \"kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.538955 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb\") pod \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\" (UID: \"cf55af5b-7fa0-4c22-8edc-868fdd43d7c6\") " Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.544132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249" (OuterVolumeSpecName: "kube-api-access-pf249") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "kube-api-access-pf249". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.582795 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.585138 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config" (OuterVolumeSpecName: "config") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.593585 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.594489 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.600078 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" (UID: "cf55af5b-7fa0-4c22-8edc-868fdd43d7c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.640787 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf249\" (UniqueName: \"kubernetes.io/projected/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-kube-api-access-pf249\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.641083 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.641094 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.641102 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.641110 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.641118 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.829461 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"] Mar 18 18:21:22 crc kubenswrapper[4830]: I0318 18:21:22.836999 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff6d84665-njgzs"] Mar 18 18:21:23 crc kubenswrapper[4830]: I0318 18:21:23.510511 4830 generic.go:334] "Generic (PLEG): container finished" podID="40d348ed-98d2-494b-b2b1-f1dfb190a636" containerID="169735e5b77a4533222c6bab6fdb49feec714327883d4fd25c5a424ede227b9d" exitCode=0 Mar 18 18:21:23 crc kubenswrapper[4830]: I0318 18:21:23.510619 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvvs5" event={"ID":"40d348ed-98d2-494b-b2b1-f1dfb190a636","Type":"ContainerDied","Data":"169735e5b77a4533222c6bab6fdb49feec714327883d4fd25c5a424ede227b9d"} Mar 18 18:21:23 crc kubenswrapper[4830]: I0318 18:21:23.513032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-96knc" event={"ID":"8c42d089-56c7-45ee-ba54-ee464499ff29","Type":"ContainerStarted","Data":"9e6361c5ac167888b9787c9edea1f987baf91625a38cc774ca68d2fd5c29279b"} Mar 18 18:21:23 crc kubenswrapper[4830]: I0318 18:21:23.550974 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-96knc" podStartSLOduration=1.852658658 podStartE2EDuration="35.550955881s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="2026-03-18 18:20:49.178658 +0000 UTC m=+1083.746288342" lastFinishedPulling="2026-03-18 18:21:22.876955233 +0000 UTC m=+1117.444585565" observedRunningTime="2026-03-18 18:21:23.542912365 +0000 UTC m=+1118.110542697" watchObservedRunningTime="2026-03-18 18:21:23.550955881 +0000 UTC m=+1118.118586213" Mar 18 18:21:24 crc kubenswrapper[4830]: I0318 18:21:24.249662 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" path="/var/lib/kubelet/pods/cf55af5b-7fa0-4c22-8edc-868fdd43d7c6/volumes" Mar 18 18:21:24 crc kubenswrapper[4830]: I0318 18:21:24.837274 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:21:24 crc kubenswrapper[4830]: I0318 18:21:24.837318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:21:24 crc kubenswrapper[4830]: I0318 18:21:24.880703 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:21:24 crc kubenswrapper[4830]: I0318 18:21:24.882296 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.325887 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.326178 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.358608 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.413092 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.533644 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.533684 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.533698 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:25 crc kubenswrapper[4830]: I0318 18:21:25.533710 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.353325 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.367406 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.382428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.443860 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.486693 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wvvs5" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.544437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trfdx\" (UniqueName: \"kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx\") pod \"40d348ed-98d2-494b-b2b1-f1dfb190a636\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.544562 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle\") pod \"40d348ed-98d2-494b-b2b1-f1dfb190a636\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.544586 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data\") pod \"40d348ed-98d2-494b-b2b1-f1dfb190a636\" (UID: \"40d348ed-98d2-494b-b2b1-f1dfb190a636\") " Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.550682 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx" (OuterVolumeSpecName: "kube-api-access-trfdx") pod "40d348ed-98d2-494b-b2b1-f1dfb190a636" (UID: "40d348ed-98d2-494b-b2b1-f1dfb190a636"). InnerVolumeSpecName "kube-api-access-trfdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.553668 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "40d348ed-98d2-494b-b2b1-f1dfb190a636" (UID: "40d348ed-98d2-494b-b2b1-f1dfb190a636"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.565725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wvvs5" event={"ID":"40d348ed-98d2-494b-b2b1-f1dfb190a636","Type":"ContainerDied","Data":"8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9"} Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.565758 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ceaa383a57ab73501b289a2bf746f35b115ec00610bd8caa012ccbbbc6a45c9" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.565829 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wvvs5" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.570797 4830 generic.go:334] "Generic (PLEG): container finished" podID="8c42d089-56c7-45ee-ba54-ee464499ff29" containerID="9e6361c5ac167888b9787c9edea1f987baf91625a38cc774ca68d2fd5c29279b" exitCode=0 Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.571894 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-96knc" event={"ID":"8c42d089-56c7-45ee-ba54-ee464499ff29","Type":"ContainerDied","Data":"9e6361c5ac167888b9787c9edea1f987baf91625a38cc774ca68d2fd5c29279b"} Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.621096 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40d348ed-98d2-494b-b2b1-f1dfb190a636" (UID: "40d348ed-98d2-494b-b2b1-f1dfb190a636"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.647051 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trfdx\" (UniqueName: \"kubernetes.io/projected/40d348ed-98d2-494b-b2b1-f1dfb190a636-kube-api-access-trfdx\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.647085 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:27 crc kubenswrapper[4830]: I0318 18:21:27.647095 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40d348ed-98d2-494b-b2b1-f1dfb190a636-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.586934 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerStarted","Data":"449ef6edc93cf2e68fa998d3203c16831d9f5a978d92b1f5c8596ad8eec051ec"} Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.586982 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-central-agent" containerID="cri-o://50c6f76b1d4c7a12f137499a2267b56243ac8396eb3e46a198a94656acaf8ba4" gracePeriod=30 Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.587003 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="proxy-httpd" containerID="cri-o://449ef6edc93cf2e68fa998d3203c16831d9f5a978d92b1f5c8596ad8eec051ec" gracePeriod=30 Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.587024 4830 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-notification-agent" containerID="cri-o://47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487" gracePeriod=30 Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.587072 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="sg-core" containerID="cri-o://e71f3e55d1b20dc8557a690ff642e06691b7429714e7ad3a2835647f38bc0b9c" gracePeriod=30 Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.587532 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.615849 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.81348292 podStartE2EDuration="40.615830486s" podCreationTimestamp="2026-03-18 18:20:48 +0000 UTC" firstStartedPulling="2026-03-18 18:20:49.298527673 +0000 UTC m=+1083.866158005" lastFinishedPulling="2026-03-18 18:21:28.100875249 +0000 UTC m=+1122.668505571" observedRunningTime="2026-03-18 18:21:28.608357146 +0000 UTC m=+1123.175987478" watchObservedRunningTime="2026-03-18 18:21:28.615830486 +0000 UTC m=+1123.183460818" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.802896 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:21:28 crc kubenswrapper[4830]: E0318 18:21:28.803708 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="dnsmasq-dns" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.803725 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="dnsmasq-dns" Mar 18 18:21:28 crc kubenswrapper[4830]: E0318 18:21:28.803739 4830 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" containerName="barbican-db-sync" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.803747 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" containerName="barbican-db-sync" Mar 18 18:21:28 crc kubenswrapper[4830]: E0318 18:21:28.803798 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="init" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.803806 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="init" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.804012 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf55af5b-7fa0-4c22-8edc-868fdd43d7c6" containerName="dnsmasq-dns" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.804045 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" containerName="barbican-db-sync" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.814343 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.815238 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.818348 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.818500 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bwpfg" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.818610 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.857966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.859436 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.866940 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.877635 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.970985 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"] Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.972567 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:28 crc kubenswrapper[4830]: I0318 18:21:28.977961 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001184 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001338 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001611 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001694 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001734 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnj6\" (UniqueName: \"kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001858 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " 
pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.001931 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhqh\" (UniqueName: \"kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.057234 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.058450 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.061705 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.092755 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102336 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102376 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc 
kubenswrapper[4830]: I0318 18:21:29.102395 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9hr\" (UniqueName: \"kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102415 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102477 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " 
pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102512 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnj6\" (UniqueName: \"kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102543 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zsg\" (UniqueName: \"kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102584 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: 
\"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znhqh\" (UniqueName: \"kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102704 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102731 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.102759 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.104237 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.117396 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.127158 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.129229 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-96knc" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.131043 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.132746 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.133436 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " 
pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.135852 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.138081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.142435 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnj6\" (UniqueName: \"kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.144373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle\") pod \"barbican-worker-6bbb58d4c-74p8g\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.158500 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhqh\" (UniqueName: \"kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh\") pod \"barbican-keystone-listener-78f6989b54-vkxh8\" (UID: 
\"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.179164 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.196628 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.246925 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247037 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247124 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lntz\" (UniqueName: \"kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247197 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247215 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247241 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts\") pod \"8c42d089-56c7-45ee-ba54-ee464499ff29\" (UID: \"8c42d089-56c7-45ee-ba54-ee464499ff29\") " Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247513 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247604 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9hr\" (UniqueName: \"kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247665 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zsg\" (UniqueName: \"kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.247724 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.248608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.249334 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.249867 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.253156 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.254946 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.255465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.255507 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.256576 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.264455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.266037 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.271126 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.272930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz" (OuterVolumeSpecName: "kube-api-access-9lntz") pod "8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "kube-api-access-9lntz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.286931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts" (OuterVolumeSpecName: "scripts") pod "8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.287938 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zsg\" (UniqueName: \"kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg\") pod \"dnsmasq-dns-7df4c9958f-sb6xg\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") " pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.315687 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9hr\" (UniqueName: \"kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr\") pod \"barbican-api-6dc87fdf44-w2w2m\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.332492 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data" (OuterVolumeSpecName: "config-data") pod 
"8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.341299 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c42d089-56c7-45ee-ba54-ee464499ff29" (UID: "8c42d089-56c7-45ee-ba54-ee464499ff29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.349304 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.349335 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.349349 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lntz\" (UniqueName: \"kubernetes.io/projected/8c42d089-56c7-45ee-ba54-ee464499ff29-kube-api-access-9lntz\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.349362 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.349376 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c42d089-56c7-45ee-ba54-ee464499ff29-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc 
kubenswrapper[4830]: I0318 18:21:29.349387 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c42d089-56c7-45ee-ba54-ee464499ff29-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.374198 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.451747 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.607058 4830 generic.go:334] "Generic (PLEG): container finished" podID="7df9e731-0537-4d90-a4c1-907721b227e1" containerID="449ef6edc93cf2e68fa998d3203c16831d9f5a978d92b1f5c8596ad8eec051ec" exitCode=0 Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.607095 4830 generic.go:334] "Generic (PLEG): container finished" podID="7df9e731-0537-4d90-a4c1-907721b227e1" containerID="e71f3e55d1b20dc8557a690ff642e06691b7429714e7ad3a2835647f38bc0b9c" exitCode=2 Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.607105 4830 generic.go:334] "Generic (PLEG): container finished" podID="7df9e731-0537-4d90-a4c1-907721b227e1" containerID="50c6f76b1d4c7a12f137499a2267b56243ac8396eb3e46a198a94656acaf8ba4" exitCode=0 Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.607141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerDied","Data":"449ef6edc93cf2e68fa998d3203c16831d9f5a978d92b1f5c8596ad8eec051ec"} Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.607174 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerDied","Data":"e71f3e55d1b20dc8557a690ff642e06691b7429714e7ad3a2835647f38bc0b9c"} Mar 18 18:21:29 crc 
kubenswrapper[4830]: I0318 18:21:29.607183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerDied","Data":"50c6f76b1d4c7a12f137499a2267b56243ac8396eb3e46a198a94656acaf8ba4"} Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.615183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-96knc" event={"ID":"8c42d089-56c7-45ee-ba54-ee464499ff29","Type":"ContainerDied","Data":"3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc"} Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.615210 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3488ab74d8af70643fcb5458bc3517671d7ecd0b15a97e813945d74835fd43fc" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.615262 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-96knc" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.710594 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.849975 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:29 crc kubenswrapper[4830]: E0318 18:21:29.856384 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" containerName="cinder-db-sync" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.856409 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" containerName="cinder-db-sync" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.856577 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" containerName="cinder-db-sync" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.857484 4830 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.867585 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bzz8b" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.867591 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.867798 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.868357 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.870374 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.912845 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.932526 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.941828 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.943287 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.948007 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.957252 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.968206 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.970952 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.971059 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.971101 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.971137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.971215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqgj\" (UniqueName: \"kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.971272 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:29 crc kubenswrapper[4830]: I0318 18:21:29.973947 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.017023 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.027753 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.042609 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"] Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072365 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqgj\" (UniqueName: \"kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072403 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdhb\" (UniqueName: \"kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072612 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m7qp\" (UniqueName: \"kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072782 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072845 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.072976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073031 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073080 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073098 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073120 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073143 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073168 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " 
pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073233 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.073278 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.075588 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.079488 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.084790 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.085287 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.090184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqgj\" (UniqueName: \"kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.090607 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174674 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: 
\"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174765 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174798 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174854 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc 
kubenswrapper[4830]: I0318 18:21:30.174904 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdhb\" (UniqueName: \"kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174923 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.174976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m7qp\" (UniqueName: \"kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.175010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.175031 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.175713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.176019 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.176670 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.177078 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.177091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.177802 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " 
pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.179429 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.179657 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.180247 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.183897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.184581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.191880 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc 
kubenswrapper[4830]: I0318 18:21:30.195034 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdhb\" (UniqueName: \"kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb\") pod \"cinder-api-0\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") " pod="openstack/cinder-api-0" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.200480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m7qp\" (UniqueName: \"kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp\") pod \"dnsmasq-dns-8995fbb57-qhlkl\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.304119 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.321333 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.629875 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerStarted","Data":"56e7a13896b3a18f5d1fbe2b3c404e40bf121b90f6ede29fdbfac10fc578905d"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.651192 4830 generic.go:334] "Generic (PLEG): container finished" podID="d087f379-e58a-4990-8e43-609c4f5feb40" containerID="e2a24f2731a4f6f7c039eb1156e94e3bb2a6481e5147738adf800833104d5e02" exitCode=0
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.651245 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" event={"ID":"d087f379-e58a-4990-8e43-609c4f5feb40","Type":"ContainerDied","Data":"e2a24f2731a4f6f7c039eb1156e94e3bb2a6481e5147738adf800833104d5e02"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.651269 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" event={"ID":"d087f379-e58a-4990-8e43-609c4f5feb40","Type":"ContainerStarted","Data":"d5d42ede5293d5bb7ce0aa5afe6115ab2d0e309599d15b9d9531f947fc8d222e"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.653216 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.667785 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerStarted","Data":"8fd7b21c0dd8911f3a79460cf0b624cea7d3f7d24a3296d05117200282e82b9a"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.668043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerStarted","Data":"f9e9fc6cd88f98e08dc7d5c4419d20524d12a4843dc86a24b46f29998b131c98"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.668055 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerStarted","Data":"f127aec0f0f848aa2fd8d54fb8227417a9ac9135ec653fa2b922073c732b6cbd"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.668087 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dc87fdf44-w2w2m"
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.668440 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dc87fdf44-w2w2m"
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.689013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerStarted","Data":"7d5a15b177dc6f2188c753b993e032ffbf4e4b88ccdcdc22f26dcd0b1a630d90"}
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.710665 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dc87fdf44-w2w2m" podStartSLOduration=1.710409173 podStartE2EDuration="1.710409173s" podCreationTimestamp="2026-03-18 18:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:30.709666542 +0000 UTC m=+1125.277296874" watchObservedRunningTime="2026-03-18 18:21:30.710409173 +0000 UTC m=+1125.278039505"
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.902024 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"]
Mar 18 18:21:30 crc kubenswrapper[4830]: I0318 18:21:30.929338 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.047314 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg"
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.095585 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4zsg\" (UniqueName: \"kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.095957 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.096008 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.096090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.096138 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.096180 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc\") pod \"d087f379-e58a-4990-8e43-609c4f5feb40\" (UID: \"d087f379-e58a-4990-8e43-609c4f5feb40\") "
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.099676 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg" (OuterVolumeSpecName: "kube-api-access-t4zsg") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "kube-api-access-t4zsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.118943 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.118978 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.118687 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config" (OuterVolumeSpecName: "config") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.123028 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.125366 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d087f379-e58a-4990-8e43-609c4f5feb40" (UID: "d087f379-e58a-4990-8e43-609c4f5feb40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197755 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4zsg\" (UniqueName: \"kubernetes.io/projected/d087f379-e58a-4990-8e43-609c4f5feb40-kube-api-access-t4zsg\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197807 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197842 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197853 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197875 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.197886 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087f379-e58a-4990-8e43-609c4f5feb40-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.712655 4830 generic.go:334] "Generic (PLEG): container finished" podID="d92164cb-de18-4223-9379-203b3e0cb28b" containerID="122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88" exitCode=0
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.712724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" event={"ID":"d92164cb-de18-4223-9379-203b3e0cb28b","Type":"ContainerDied","Data":"122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.712752 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" event={"ID":"d92164cb-de18-4223-9379-203b3e0cb28b","Type":"ContainerStarted","Data":"44dc4f57d6d0146271c47a40e81d34f33506e97a5c4eef49b45308a63a9bd0e3"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.714959 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerStarted","Data":"27b26da026f4f072e149696ffbb45d1357ad51359a08f15bba482d1b467e658b"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.716549 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerStarted","Data":"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.716589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerStarted","Data":"6a3acbe4a1e7dd3a7a5ef9e51eeae47da43db487311dcd7cc9ba07abd13a1379"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.719236 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg"
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.719661 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-sb6xg" event={"ID":"d087f379-e58a-4990-8e43-609c4f5feb40","Type":"ContainerDied","Data":"d5d42ede5293d5bb7ce0aa5afe6115ab2d0e309599d15b9d9531f947fc8d222e"}
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.719748 4830 scope.go:117] "RemoveContainer" containerID="e2a24f2731a4f6f7c039eb1156e94e3bb2a6481e5147738adf800833104d5e02"
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.841306 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"]
Mar 18 18:21:31 crc kubenswrapper[4830]: I0318 18:21:31.876020 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-sb6xg"]
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.254631 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d087f379-e58a-4990-8e43-609c4f5feb40" path="/var/lib/kubelet/pods/d087f379-e58a-4990-8e43-609c4f5feb40/volumes"
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.730016 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerStarted","Data":"e3cd2ffc35cea964dcec2e27b4b151f289beecdcd0e3b5f7b932d52f599b93c0"}
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.730457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerStarted","Data":"904ded3c9841d4d431c9a8b7917b3f2eec10c31241a56280fbcc48164d2a5323"}
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.732695 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" event={"ID":"d92164cb-de18-4223-9379-203b3e0cb28b","Type":"ContainerStarted","Data":"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654"}
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.740013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerStarted","Data":"fc53817ebacc0ce8c203daf49d972d55c5cc1843b058744c9a909e3088e8e2dc"}
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.740060 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerStarted","Data":"164f985d7fecf295460783b0211bbc6afa41549a232c6b8704e09de623fb3cd3"}
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.754701 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6bbb58d4c-74p8g" podStartSLOduration=2.719047867 podStartE2EDuration="4.754681017s" podCreationTimestamp="2026-03-18 18:21:28 +0000 UTC" firstStartedPulling="2026-03-18 18:21:29.920109955 +0000 UTC m=+1124.487740287" lastFinishedPulling="2026-03-18 18:21:31.955743115 +0000 UTC m=+1126.523373437" observedRunningTime="2026-03-18 18:21:32.751437005 +0000 UTC m=+1127.319067337" watchObservedRunningTime="2026-03-18 18:21:32.754681017 +0000 UTC m=+1127.322311359"
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.778605 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" podStartSLOduration=3.778574118 podStartE2EDuration="3.778574118s" podCreationTimestamp="2026-03-18 18:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:32.77046512 +0000 UTC m=+1127.338095462" watchObservedRunningTime="2026-03-18 18:21:32.778574118 +0000 UTC m=+1127.346204470"
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.792897 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" podStartSLOduration=2.64270135 podStartE2EDuration="4.79287428s" podCreationTimestamp="2026-03-18 18:21:28 +0000 UTC" firstStartedPulling="2026-03-18 18:21:29.726924494 +0000 UTC m=+1124.294554826" lastFinishedPulling="2026-03-18 18:21:31.877097424 +0000 UTC m=+1126.444727756" observedRunningTime="2026-03-18 18:21:32.792419278 +0000 UTC m=+1127.360049610" watchObservedRunningTime="2026-03-18 18:21:32.79287428 +0000 UTC m=+1127.360504612"
Mar 18 18:21:32 crc kubenswrapper[4830]: I0318 18:21:32.937828 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 18:21:33 crc kubenswrapper[4830]: E0318 18:21:33.432303 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df9e731_0537_4d90_a4c1_907721b227e1.slice/crio-conmon-47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.758987 4830 generic.go:334] "Generic (PLEG): container finished" podID="7df9e731-0537-4d90-a4c1-907721b227e1" containerID="47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487" exitCode=0
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.759163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerDied","Data":"47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487"}
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.770574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerStarted","Data":"2798e226f5d33632ac3e39e1b2da992a846726cf311ecc786f14ceaefbe70926"}
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.773891 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api-log" containerID="cri-o://53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e" gracePeriod=30
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.774139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerStarted","Data":"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"}
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.774557 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl"
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.775019 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api" containerID="cri-o://3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22" gracePeriod=30
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.775073 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 18:21:33 crc kubenswrapper[4830]: I0318 18:21:33.830256 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.830226965 podStartE2EDuration="4.830226965s" podCreationTimestamp="2026-03-18 18:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:33.810437819 +0000 UTC m=+1128.378068151" watchObservedRunningTime="2026-03-18 18:21:33.830226965 +0000 UTC m=+1128.397857297"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.064259 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090359 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd26g\" (UniqueName: \"kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090448 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090563 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090627 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.090652 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle\") pod \"7df9e731-0537-4d90-a4c1-907721b227e1\" (UID: \"7df9e731-0537-4d90-a4c1-907721b227e1\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.092253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.093045 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.095247 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.095271 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df9e731-0537-4d90-a4c1-907721b227e1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.099844 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g" (OuterVolumeSpecName: "kube-api-access-fd26g") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "kube-api-access-fd26g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.113593 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts" (OuterVolumeSpecName: "scripts") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.178997 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.196731 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.196784 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.196799 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd26g\" (UniqueName: \"kubernetes.io/projected/7df9e731-0537-4d90-a4c1-907721b227e1-kube-api-access-fd26g\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.262855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.268917 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data" (OuterVolumeSpecName: "config-data") pod "7df9e731-0537-4d90-a4c1-907721b227e1" (UID: "7df9e731-0537-4d90-a4c1-907721b227e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.299398 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.299429 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9e731-0537-4d90-a4c1-907721b227e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.308752 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400594 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400678 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400736 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400760 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400802 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400863 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400910 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.400947 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtdhb\" (UniqueName: \"kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb\") pod \"e07a019b-0836-4e90-b596-0f06f04a9330\" (UID: \"e07a019b-0836-4e90-b596-0f06f04a9330\") "
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.401262 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e07a019b-0836-4e90-b596-0f06f04a9330-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.401863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs" (OuterVolumeSpecName: "logs") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.405032 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts" (OuterVolumeSpecName: "scripts") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.407632 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.407663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb" (OuterVolumeSpecName: "kube-api-access-gtdhb") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "kube-api-access-gtdhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.429765 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.447023 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data" (OuterVolumeSpecName: "config-data") pod "e07a019b-0836-4e90-b596-0f06f04a9330" (UID: "e07a019b-0836-4e90-b596-0f06f04a9330"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503324 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503356 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503368 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503377 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07a019b-0836-4e90-b596-0f06f04a9330-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503389 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e07a019b-0836-4e90-b596-0f06f04a9330-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.503397 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtdhb\" (UniqueName: \"kubernetes.io/projected/e07a019b-0836-4e90-b596-0f06f04a9330-kube-api-access-gtdhb\") on node \"crc\" DevicePath \"\""
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.782962 4830 generic.go:334] "Generic (PLEG): container finished" podID="e07a019b-0836-4e90-b596-0f06f04a9330" containerID="3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22" exitCode=0
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.782994 4830 generic.go:334] "Generic (PLEG): container finished" podID="e07a019b-0836-4e90-b596-0f06f04a9330" containerID="53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e" exitCode=143
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.783027 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerDied","Data":"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"}
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.783052 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerDied","Data":"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"}
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.783062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e07a019b-0836-4e90-b596-0f06f04a9330","Type":"ContainerDied","Data":"6a3acbe4a1e7dd3a7a5ef9e51eeae47da43db487311dcd7cc9ba07abd13a1379"}
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.783076 4830 scope.go:117] "RemoveContainer" containerID="3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.783160 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.795207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df9e731-0537-4d90-a4c1-907721b227e1","Type":"ContainerDied","Data":"7a00b93a5794418067e1fb78825d2e8e9864ecafcfce286d6f6c1317bcde293d"}
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.795294 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.800503 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerStarted","Data":"1beae2c931be0c4737d38ea81296666d857479db917f360e7a3d94688386583c"}
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.830030 4830 scope.go:117] "RemoveContainer" containerID="53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.836133 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.850553 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.851320 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.552475216 podStartE2EDuration="5.851263s" podCreationTimestamp="2026-03-18 18:21:29 +0000 UTC" firstStartedPulling="2026-03-18 18:21:30.658678479 +0000 UTC m=+1125.226308811" lastFinishedPulling="2026-03-18 18:21:31.957466263 +0000 UTC m=+1126.525096595" observedRunningTime="2026-03-18 18:21:34.843324097 +0000 UTC m=+1129.410954459" watchObservedRunningTime="2026-03-18 18:21:34.851263 +0000 UTC m=+1129.418893372"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.880965 4830 scope.go:117] "RemoveContainer" containerID="3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"
Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.882425 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22\": container with ID starting with 3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22 not found: ID does not exist" containerID="3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.882497 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"} err="failed to get container status \"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22\": rpc error: code = NotFound desc = could not find container \"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22\": container with ID starting with 3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22 not found: ID does not exist"
Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.882540 4830 scope.go:117] "RemoveContainer" containerID="53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"
Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.883156 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e\": container with ID starting with 53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e not found: ID does not exist"
containerID="53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.883187 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"} err="failed to get container status \"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e\": rpc error: code = NotFound desc = could not find container \"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e\": container with ID starting with 53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e not found: ID does not exist" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.883206 4830 scope.go:117] "RemoveContainer" containerID="3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.886116 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22"} err="failed to get container status \"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22\": rpc error: code = NotFound desc = could not find container \"3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22\": container with ID starting with 3729516e2058e7ad2f87b2479bb0bda86b0d9348782578427d221ff42d8d9d22 not found: ID does not exist" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.886162 4830 scope.go:117] "RemoveContainer" containerID="53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.889809 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e"} err="failed to get container status \"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e\": rpc error: code = NotFound desc = could 
not find container \"53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e\": container with ID starting with 53b065ebfb2ca291a992c335c6b104296b92679688303232de651450fc67dd2e not found: ID does not exist" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.889843 4830 scope.go:117] "RemoveContainer" containerID="449ef6edc93cf2e68fa998d3203c16831d9f5a978d92b1f5c8596ad8eec051ec" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.895925 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896326 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d087f379-e58a-4990-8e43-609c4f5feb40" containerName="init" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896340 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d087f379-e58a-4990-8e43-609c4f5feb40" containerName="init" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896357 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="proxy-httpd" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896363 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="proxy-httpd" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896384 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-notification-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896390 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-notification-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896402 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api-log" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896407 
4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api-log" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896417 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896422 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896434 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="sg-core" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896440 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="sg-core" Mar 18 18:21:34 crc kubenswrapper[4830]: E0318 18:21:34.896453 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-central-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.896460 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-central-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900272 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d087f379-e58a-4990-8e43-609c4f5feb40" containerName="init" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900306 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="sg-core" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900319 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-notification-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900334 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="ceilometer-central-agent" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900350 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api-log" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900365 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" containerName="proxy-httpd" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.900372 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" containerName="cinder-api" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.901556 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.911316 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.914641 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.914757 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.917832 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.929523 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.944283 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.956093 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.958487 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.961148 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.961370 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.963224 4830 scope.go:117] "RemoveContainer" containerID="e71f3e55d1b20dc8557a690ff642e06691b7429714e7ad3a2835647f38bc0b9c" Mar 18 18:21:34 crc kubenswrapper[4830]: I0318 18:21:34.974222 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.013607 4830 scope.go:117] "RemoveContainer" containerID="47fed7dea201a7cd51fbe2d563f0f2959e9ef377e040f6d593b4d99795c35487" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028506 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028553 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028641 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028678 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxmg\" (UniqueName: \"kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg\") pod \"cinder-api-0\" (UID: 
\"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028834 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028851 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtkzl\" (UniqueName: \"kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 
18:21:35.028869 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028889 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.028910 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.052647 4830 scope.go:117] "RemoveContainer" containerID="50c6f76b1d4c7a12f137499a2267b56243ac8396eb3e46a198a94656acaf8ba4" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.129972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130040 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxmg\" (UniqueName: \"kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc 
kubenswrapper[4830]: I0318 18:21:35.130078 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130130 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130200 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtkzl\" (UniqueName: \"kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130227 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130254 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130317 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130391 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 
18:21:35.130418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.130878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.131141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.133590 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " 
pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.137970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.144291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.145320 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.145992 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146421 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146442 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.144862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146624 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.146918 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.148832 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jtkzl\" (UniqueName: \"kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl\") pod \"ceilometer-0\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.162670 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxmg\" (UniqueName: \"kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg\") pod \"cinder-api-0\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.180114 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.289013 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.295561 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.711424 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.713795 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.717067 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.717170 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.736861 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.852223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.852652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.852704 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.852959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.853023 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqz9t\" (UniqueName: \"kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.853051 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.853081 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.854625 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.919896 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959362 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959424 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz9t\" (UniqueName: \"kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959475 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959498 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959514 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.959537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.961314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.965787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.967323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.967973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom\") pod 
\"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.969751 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.972086 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:35 crc kubenswrapper[4830]: I0318 18:21:35.989538 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz9t\" (UniqueName: \"kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t\") pod \"barbican-api-7f884dc87d-6wvs2\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.071851 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.275161 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df9e731-0537-4d90-a4c1-907721b227e1" path="/var/lib/kubelet/pods/7df9e731-0537-4d90-a4c1-907721b227e1/volumes" Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.276571 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07a019b-0836-4e90-b596-0f06f04a9330" path="/var/lib/kubelet/pods/e07a019b-0836-4e90-b596-0f06f04a9330/volumes" Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.556185 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:21:36 crc kubenswrapper[4830]: W0318 18:21:36.562762 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e152864_9096_47a7_b0b0_c288840093e7.slice/crio-540de3175421e90b719f9dae4a86fc275fed2e759d940020ba95c8d739092d80 WatchSource:0}: Error finding container 540de3175421e90b719f9dae4a86fc275fed2e759d940020ba95c8d739092d80: Status 404 returned error can't find the container with id 540de3175421e90b719f9dae4a86fc275fed2e759d940020ba95c8d739092d80 Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.908119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerStarted","Data":"540de3175421e90b719f9dae4a86fc275fed2e759d940020ba95c8d739092d80"} Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.937755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerStarted","Data":"7f2292a71a9c798a2c17f9c6fc6b6d12fc68c263b920ce067f07390c0bc23f23"} Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.937838 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerStarted","Data":"c4362062ff7d150b86119d5b1cbf2b485cb23abf3d495b273bcb1819655c53b7"} Mar 18 18:21:36 crc kubenswrapper[4830]: I0318 18:21:36.940007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerStarted","Data":"ca961ff56c2a64ad3c6c171e67ee0c3f74464740fa004e173274e78f41ddc694"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.950712 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerStarted","Data":"e20014a42907afd388ba14b211a6c05885fe859da4a4d5b322dfc735c19c8637"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.951997 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerStarted","Data":"b03c2437bc3b020985e30c6a140c3c922aeb2a95cd6d3bf3c72a87ffaf8ce7ba"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.952018 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.952030 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.952555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerStarted","Data":"29fc62aa8b0c7dff64144c93d1f53c7be2667c73d45b77f8b2e9fee0136dd279"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.952659 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 
18:21:37.956023 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerStarted","Data":"1d4c0a46d0e3d58389d6d4c79bc3a4447609fd3f2a0aa21592f1f31e7aea934b"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.956051 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerStarted","Data":"3943dd6326b04778113bfba3570fc62a9f0270a46dfc202dbf962068fe710305"} Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.971873 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f884dc87d-6wvs2" podStartSLOduration=2.971854222 podStartE2EDuration="2.971854222s" podCreationTimestamp="2026-03-18 18:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:37.968095407 +0000 UTC m=+1132.535725729" watchObservedRunningTime="2026-03-18 18:21:37.971854222 +0000 UTC m=+1132.539484554" Mar 18 18:21:37 crc kubenswrapper[4830]: I0318 18:21:37.999021 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.999000266 podStartE2EDuration="3.999000266s" podCreationTimestamp="2026-03-18 18:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:37.994997813 +0000 UTC m=+1132.562628155" watchObservedRunningTime="2026-03-18 18:21:37.999000266 +0000 UTC m=+1132.566630608" Mar 18 18:21:38 crc kubenswrapper[4830]: I0318 18:21:38.976570 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerStarted","Data":"292a2024950e615405e4ab375dd3bfc11b4aa3a56bc9871f42d40431922a45a4"} Mar 18 18:21:40 crc 
kubenswrapper[4830]: I0318 18:21:40.306821 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.373278 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"] Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.378610 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" podUID="56705b88-b409-4811-a999-f548a0f108c7" containerName="dnsmasq-dns" containerID="cri-o://023f57ad1843e4c93604c976b9f7c86bb24ad0a5d2f1dfeb4f8efe8a9f50ae62" gracePeriod=10 Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.447471 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.561563 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.975095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.998350 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerStarted","Data":"0aea7afbac277c581335abdba76abe516fb4c397a82745e2f599d61da1db92bb"} Mar 18 18:21:40 crc kubenswrapper[4830]: I0318 18:21:40.998516 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.000472 4830 generic.go:334] "Generic (PLEG): container finished" podID="56705b88-b409-4811-a999-f548a0f108c7" containerID="023f57ad1843e4c93604c976b9f7c86bb24ad0a5d2f1dfeb4f8efe8a9f50ae62" exitCode=0 Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.000630 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="cinder-scheduler" containerID="cri-o://2798e226f5d33632ac3e39e1b2da992a846726cf311ecc786f14ceaefbe70926" gracePeriod=30 Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.000715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" event={"ID":"56705b88-b409-4811-a999-f548a0f108c7","Type":"ContainerDied","Data":"023f57ad1843e4c93604c976b9f7c86bb24ad0a5d2f1dfeb4f8efe8a9f50ae62"} Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.000786 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="probe" containerID="cri-o://1beae2c931be0c4737d38ea81296666d857479db917f360e7a3d94688386583c" gracePeriod=30 Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.042082 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.51305866 podStartE2EDuration="7.041755141s" podCreationTimestamp="2026-03-18 18:21:34 +0000 UTC" firstStartedPulling="2026-03-18 18:21:35.920964784 +0000 UTC m=+1130.488595106" lastFinishedPulling="2026-03-18 18:21:40.449661255 +0000 UTC m=+1135.017291587" observedRunningTime="2026-03-18 18:21:41.040822275 +0000 UTC m=+1135.608452617" watchObservedRunningTime="2026-03-18 18:21:41.041755141 +0000 UTC m=+1135.609385473" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.067598 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.143356 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275417 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275630 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275663 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275828 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275879 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bqw4\" (UniqueName: \"kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.275917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config\") pod \"56705b88-b409-4811-a999-f548a0f108c7\" (UID: \"56705b88-b409-4811-a999-f548a0f108c7\") " Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.287005 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4" (OuterVolumeSpecName: "kube-api-access-9bqw4") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "kube-api-access-9bqw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.327543 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.329332 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config" (OuterVolumeSpecName: "config") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.330591 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.333232 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.334330 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56705b88-b409-4811-a999-f548a0f108c7" (UID: "56705b88-b409-4811-a999-f548a0f108c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378798 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378841 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bqw4\" (UniqueName: \"kubernetes.io/projected/56705b88-b409-4811-a999-f548a0f108c7-kube-api-access-9bqw4\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378854 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378864 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378875 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.378883 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56705b88-b409-4811-a999-f548a0f108c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:41 crc kubenswrapper[4830]: I0318 18:21:41.744805 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75b6876cd4-frf5t" Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.009616 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" event={"ID":"56705b88-b409-4811-a999-f548a0f108c7","Type":"ContainerDied","Data":"05109e7091ef81f157aeed03c072a59d2e8765e3a804c5c8316ed9e69ffcd911"} Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.009650 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-kjddm" Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.009671 4830 scope.go:117] "RemoveContainer" containerID="023f57ad1843e4c93604c976b9f7c86bb24ad0a5d2f1dfeb4f8efe8a9f50ae62" Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.012398 4830 generic.go:334] "Generic (PLEG): container finished" podID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerID="1beae2c931be0c4737d38ea81296666d857479db917f360e7a3d94688386583c" exitCode=0 Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.013229 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerDied","Data":"1beae2c931be0c4737d38ea81296666d857479db917f360e7a3d94688386583c"} Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.035034 4830 scope.go:117] "RemoveContainer" containerID="f21019a22eb84ca940826b9de78b1fbbaa1ea01b137efc55759b6a2d08e34568" Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.067025 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"] Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.077234 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-kjddm"] Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.248508 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56705b88-b409-4811-a999-f548a0f108c7" path="/var/lib/kubelet/pods/56705b88-b409-4811-a999-f548a0f108c7/volumes" Mar 18 18:21:42 crc kubenswrapper[4830]: I0318 18:21:42.685401 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:44 crc kubenswrapper[4830]: I0318 18:21:44.011388 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:21:44 crc kubenswrapper[4830]: I0318 18:21:44.112889 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:44 crc kubenswrapper[4830]: I0318 18:21:44.113146 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dc87fdf44-w2w2m" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api-log" containerID="cri-o://f9e9fc6cd88f98e08dc7d5c4419d20524d12a4843dc86a24b46f29998b131c98" gracePeriod=30 Mar 18 18:21:44 crc kubenswrapper[4830]: I0318 18:21:44.113255 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dc87fdf44-w2w2m" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api" containerID="cri-o://8fd7b21c0dd8911f3a79460cf0b624cea7d3f7d24a3296d05117200282e82b9a" gracePeriod=30 Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.051729 4830 generic.go:334] "Generic (PLEG): container finished" podID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerID="2798e226f5d33632ac3e39e1b2da992a846726cf311ecc786f14ceaefbe70926" exitCode=0 Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.052131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerDied","Data":"2798e226f5d33632ac3e39e1b2da992a846726cf311ecc786f14ceaefbe70926"} Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.052160 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e","Type":"ContainerDied","Data":"27b26da026f4f072e149696ffbb45d1357ad51359a08f15bba482d1b467e658b"} Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.052170 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27b26da026f4f072e149696ffbb45d1357ad51359a08f15bba482d1b467e658b" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.075510 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerID="f9e9fc6cd88f98e08dc7d5c4419d20524d12a4843dc86a24b46f29998b131c98" exitCode=143 Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.075610 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerDied","Data":"f9e9fc6cd88f98e08dc7d5c4419d20524d12a4843dc86a24b46f29998b131c98"} Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.091223 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.148639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data\") pod \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.148872 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle\") pod \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.150352 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts\") pod \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.150961 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom\") pod 
\"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.151002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id\") pod \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.151108 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rqgj\" (UniqueName: \"kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj\") pod \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\" (UID: \"a60a5ad8-7dde-4000-acc7-83a4bc24fd7e\") " Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.154880 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.173077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj" (OuterVolumeSpecName: "kube-api-access-9rqgj") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "kube-api-access-9rqgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.173095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.174936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts" (OuterVolumeSpecName: "scripts") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.228347 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.255877 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.255910 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.255919 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.255937 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.255946 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rqgj\" (UniqueName: \"kubernetes.io/projected/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-kube-api-access-9rqgj\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.336229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data" (OuterVolumeSpecName: "config-data") pod "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" (UID: "a60a5ad8-7dde-4000-acc7-83a4bc24fd7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.357292 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.463506 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.468442 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.521288 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.588006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"] Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.588244 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b6876cd4-frf5t" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-api" containerID="cri-o://fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af" gracePeriod=30 Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.588636 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b6876cd4-frf5t" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-httpd" containerID="cri-o://d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e" gracePeriod=30 Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.780580 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:21:45 crc kubenswrapper[4830]: E0318 18:21:45.782289 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="cinder-scheduler" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.782304 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="cinder-scheduler" Mar 18 18:21:45 crc kubenswrapper[4830]: E0318 18:21:45.783047 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56705b88-b409-4811-a999-f548a0f108c7" containerName="dnsmasq-dns" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783056 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="56705b88-b409-4811-a999-f548a0f108c7" containerName="dnsmasq-dns" Mar 18 18:21:45 crc kubenswrapper[4830]: E0318 18:21:45.783068 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="probe" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783073 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="probe" Mar 18 18:21:45 crc kubenswrapper[4830]: E0318 18:21:45.783090 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56705b88-b409-4811-a999-f548a0f108c7" containerName="init" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783096 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="56705b88-b409-4811-a999-f548a0f108c7" containerName="init" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783253 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="probe" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783275 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" containerName="cinder-scheduler" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.783288 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="56705b88-b409-4811-a999-f548a0f108c7" 
containerName="dnsmasq-dns" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.784159 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.840361 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874501 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874569 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874589 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrdng\" (UniqueName: \"kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 
crc kubenswrapper[4830]: I0318 18:21:45.874609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874683 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.874725 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.975859 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.975930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.975976 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.975995 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.976015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.976033 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrdng\" (UniqueName: \"kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.976053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.976410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.982910 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.984604 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.985303 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:45 crc kubenswrapper[4830]: I0318 18:21:45.988269 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:45.999972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrdng\" (UniqueName: \"kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng\") pod \"placement-676956db6-6grw2\" (UID: 
\"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.005513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data\") pod \"placement-676956db6-6grw2\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.084728 4830 generic.go:334] "Generic (PLEG): container finished" podID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerID="d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e" exitCode=0 Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.085037 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerDied","Data":"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e"} Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.085158 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.117512 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.119154 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.128249 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.144883 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.146258 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.155855 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.169086 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.268235 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60a5ad8-7dde-4000-acc7-83a4bc24fd7e" path="/var/lib/kubelet/pods/a60a5ad8-7dde-4000-acc7-83a4bc24fd7e/volumes" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291585 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291660 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291716 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291737 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291788 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2z5q\" (UniqueName: \"kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.291816 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394299 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394319 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394358 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2z5q\" (UniqueName: \"kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.394411 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.396066 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.401148 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.409698 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.416677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.418339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.425522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2z5q\" (UniqueName: \"kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q\") pod \"cinder-scheduler-0\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.558607 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:21:46 crc kubenswrapper[4830]: I0318 18:21:46.693403 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:21:47 crc kubenswrapper[4830]: I0318 18:21:47.074919 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:21:47 crc kubenswrapper[4830]: I0318 18:21:47.125595 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerStarted","Data":"1f65787d2e3aac204498b2bda1b107a09472a1e7a4c737c2468ded43190d999e"} Mar 18 18:21:47 crc kubenswrapper[4830]: I0318 18:21:47.125644 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerStarted","Data":"ba4e7878acb6a02897824a50a1bd651baa3285202223a312e374b33d07c03478"} Mar 18 18:21:47 crc kubenswrapper[4830]: I0318 18:21:47.711134 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dc87fdf44-w2w2m" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:38144->10.217.0.165:9311: read: connection reset by peer" Mar 18 18:21:47 crc kubenswrapper[4830]: I0318 18:21:47.711187 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dc87fdf44-w2w2m" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:38148->10.217.0.165:9311: read: connection reset by peer" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.178810 4830 generic.go:334] "Generic (PLEG): container finished" podID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" 
containerID="8fd7b21c0dd8911f3a79460cf0b624cea7d3f7d24a3296d05117200282e82b9a" exitCode=0 Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.178878 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerDied","Data":"8fd7b21c0dd8911f3a79460cf0b624cea7d3f7d24a3296d05117200282e82b9a"} Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.187412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerStarted","Data":"b5a1cb9ea4b62aec9ff11f16f75f04dd21b28a1d37ed79fbc6fa3de1b8390289"} Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.187455 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerStarted","Data":"cb2d6e60f8a2bd293191e5cf991ab5c2cf38315ea96e0ad1d379d20116844258"} Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.197087 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerStarted","Data":"06b5da3aa085e9b3e11d65936e872fab74b18aa97d39f5db82fc225e3ce954b4"} Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.197348 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.197395 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-676956db6-6grw2" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.234890 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-676956db6-6grw2" podStartSLOduration=3.234854039 podStartE2EDuration="3.234854039s" podCreationTimestamp="2026-03-18 18:21:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:48.218844928 +0000 UTC m=+1142.786475260" watchObservedRunningTime="2026-03-18 18:21:48.234854039 +0000 UTC m=+1142.802484371" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.246096 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.341182 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9hr\" (UniqueName: \"kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr\") pod \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.341308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs\") pod \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.341483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom\") pod \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.341557 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle\") pod \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.341721 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data\") pod \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\" (UID: \"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22\") " Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.354485 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs" (OuterVolumeSpecName: "logs") pod "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" (UID: "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.361849 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" (UID: "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.363506 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr" (OuterVolumeSpecName: "kube-api-access-jt9hr") pod "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" (UID: "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22"). InnerVolumeSpecName "kube-api-access-jt9hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.398922 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" (UID: "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.403189 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data" (OuterVolumeSpecName: "config-data") pod "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" (UID: "79e7329c-f85f-4a8c-a2a6-67b1a8f11a22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.443898 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.444205 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.444272 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9hr\" (UniqueName: \"kubernetes.io/projected/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-kube-api-access-jt9hr\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.444332 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.444384 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.538732 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f6ff8b5bf-p5xgc" Mar 
18 18:21:48 crc kubenswrapper[4830]: I0318 18:21:48.886799 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.207184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dc87fdf44-w2w2m" event={"ID":"79e7329c-f85f-4a8c-a2a6-67b1a8f11a22","Type":"ContainerDied","Data":"f127aec0f0f848aa2fd8d54fb8227417a9ac9135ec653fa2b922073c732b6cbd"} Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.207235 4830 scope.go:117] "RemoveContainer" containerID="8fd7b21c0dd8911f3a79460cf0b624cea7d3f7d24a3296d05117200282e82b9a" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.207293 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dc87fdf44-w2w2m" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.209643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerStarted","Data":"29d7529f10ab82210873010bcc63b7af8c9609591cd2c3e31e8d0689d2b017f6"} Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.239202 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.239179884 podStartE2EDuration="3.239179884s" podCreationTimestamp="2026-03-18 18:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:49.235202572 +0000 UTC m=+1143.802832904" watchObservedRunningTime="2026-03-18 18:21:49.239179884 +0000 UTC m=+1143.806810216" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.248648 4830 scope.go:117] "RemoveContainer" containerID="f9e9fc6cd88f98e08dc7d5c4419d20524d12a4843dc86a24b46f29998b131c98" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.255279 4830 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.262928 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dc87fdf44-w2w2m"] Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.954566 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 18:21:49 crc kubenswrapper[4830]: E0318 18:21:49.955294 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api-log" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.955312 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api-log" Mar 18 18:21:49 crc kubenswrapper[4830]: E0318 18:21:49.955323 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.955329 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.955520 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api-log" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.955541 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" containerName="barbican-api" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.956207 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.960070 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.960245 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jfgs6" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.960436 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.979299 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 18:21:49 crc kubenswrapper[4830]: I0318 18:21:49.980606 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b6876cd4-frf5t" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073372 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb24t\" (UniqueName: \"kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t\") pod \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle\") pod \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073498 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config\") pod \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " Mar 18 
18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073614 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs\") pod \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073649 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config\") pod \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\" (UID: \"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283\") " Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073885 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.073952 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.074028 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc42f\" (UniqueName: \"kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.074064 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.081104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" (UID: "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.090714 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t" (OuterVolumeSpecName: "kube-api-access-wb24t") pod "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" (UID: "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283"). InnerVolumeSpecName "kube-api-access-wb24t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.151702 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config" (OuterVolumeSpecName: "config") pod "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" (UID: "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.161550 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" (UID: "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.162714 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" (UID: "0eaa5e65-715d-45fd-9ce9-cb2c1adcf283"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.177140 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.177318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.177538 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.177910 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc42f\" (UniqueName: \"kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 
crc kubenswrapper[4830]: I0318 18:21:50.178003 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb24t\" (UniqueName: \"kubernetes.io/projected/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-kube-api-access-wb24t\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.178025 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.178039 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.178054 4830 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.178067 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.178304 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.182107 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.183807 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.200385 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc42f\" (UniqueName: \"kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f\") pod \"openstackclient\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.255540 4830 generic.go:334] "Generic (PLEG): container finished" podID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerID="fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af" exitCode=0 Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.256800 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b6876cd4-frf5t" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.287797 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e7329c-f85f-4a8c-a2a6-67b1a8f11a22" path="/var/lib/kubelet/pods/79e7329c-f85f-4a8c-a2a6-67b1a8f11a22/volumes" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.288722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerDied","Data":"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af"} Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.288752 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b6876cd4-frf5t" event={"ID":"0eaa5e65-715d-45fd-9ce9-cb2c1adcf283","Type":"ContainerDied","Data":"b397e3843d9fe6905ef94a24c04b49242d68c5037d106a7914625d142c38dc3b"} Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.288783 4830 scope.go:117] "RemoveContainer" containerID="d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.299003 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.302734 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"] Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.310605 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75b6876cd4-frf5t"] Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.325958 4830 scope.go:117] "RemoveContainer" containerID="fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.351454 4830 scope.go:117] "RemoveContainer" containerID="d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e" Mar 18 18:21:50 crc kubenswrapper[4830]: E0318 18:21:50.352364 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e\": container with ID starting with d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e not found: ID does not exist" containerID="d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.352445 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e"} err="failed to get container status \"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e\": rpc error: code = NotFound desc = could not find container \"d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e\": container with ID starting with d8f7462ba94d30ff3af81c767a9205afeac42282e30e8263a26db708a3be5d4e not found: ID does not exist" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.352692 4830 scope.go:117] "RemoveContainer" containerID="fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af" Mar 18 18:21:50 crc 
kubenswrapper[4830]: E0318 18:21:50.353196 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af\": container with ID starting with fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af not found: ID does not exist" containerID="fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.353230 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af"} err="failed to get container status \"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af\": rpc error: code = NotFound desc = could not find container \"fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af\": container with ID starting with fdd3b6055bbc8797932488dbe9516bb55785797cab91378825cd527f760a35af not found: ID does not exist" Mar 18 18:21:50 crc kubenswrapper[4830]: I0318 18:21:50.712667 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 18:21:51 crc kubenswrapper[4830]: I0318 18:21:51.279820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1965a180-09c8-4af1-852e-7792c02564ca","Type":"ContainerStarted","Data":"bf4a168a4c3f8770a8c86755cf5674b09ec68c9b7d624b4fbd33927be9e66c7b"} Mar 18 18:21:51 crc kubenswrapper[4830]: I0318 18:21:51.558898 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 18:21:52 crc kubenswrapper[4830]: I0318 18:21:52.244311 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" path="/var/lib/kubelet/pods/0eaa5e65-715d-45fd-9ce9-cb2c1adcf283/volumes" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.981879 4830 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:21:54 crc kubenswrapper[4830]: E0318 18:21:54.982620 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-httpd" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.982631 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-httpd" Mar 18 18:21:54 crc kubenswrapper[4830]: E0318 18:21:54.982640 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-api" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.982646 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-api" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.982814 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-api" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.982827 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaa5e65-715d-45fd-9ce9-cb2c1adcf283" containerName="neutron-httpd" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.983727 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.986040 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.986066 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 18:21:54 crc kubenswrapper[4830]: I0318 18:21:54.986102 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.007529 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.072826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fk6s\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.072872 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.072911 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: 
I0318 18:21:55.073047 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.073155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.073266 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.073305 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.073360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc 
kubenswrapper[4830]: I0318 18:21:55.175062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fk6s\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.175148 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176223 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176280 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.176360 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.177130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.180539 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.181203 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.182550 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.187274 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.192460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fk6s\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.195513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data\") pod \"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.197553 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle\") pod 
\"swift-proxy-d76d78d97-bs4hd\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.308645 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.423771 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.430220 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-central-agent" containerID="cri-o://3943dd6326b04778113bfba3570fc62a9f0270a46dfc202dbf962068fe710305" gracePeriod=30 Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.430956 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="proxy-httpd" containerID="cri-o://0aea7afbac277c581335abdba76abe516fb4c397a82745e2f599d61da1db92bb" gracePeriod=30 Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.431089 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-notification-agent" containerID="cri-o://1d4c0a46d0e3d58389d6d4c79bc3a4447609fd3f2a0aa21592f1f31e7aea934b" gracePeriod=30 Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.431261 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="sg-core" containerID="cri-o://292a2024950e615405e4ab375dd3bfc11b4aa3a56bc9871f42d40431922a45a4" gracePeriod=30 Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.437095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Mar 18 18:21:55 crc kubenswrapper[4830]: I0318 18:21:55.925062 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343040 4830 generic.go:334] "Generic (PLEG): container finished" podID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerID="0aea7afbac277c581335abdba76abe516fb4c397a82745e2f599d61da1db92bb" exitCode=0 Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343086 4830 generic.go:334] "Generic (PLEG): container finished" podID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerID="292a2024950e615405e4ab375dd3bfc11b4aa3a56bc9871f42d40431922a45a4" exitCode=2 Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343096 4830 generic.go:334] "Generic (PLEG): container finished" podID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerID="1d4c0a46d0e3d58389d6d4c79bc3a4447609fd3f2a0aa21592f1f31e7aea934b" exitCode=0 Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343111 4830 generic.go:334] "Generic (PLEG): container finished" podID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerID="3943dd6326b04778113bfba3570fc62a9f0270a46dfc202dbf962068fe710305" exitCode=0 Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343136 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerDied","Data":"0aea7afbac277c581335abdba76abe516fb4c397a82745e2f599d61da1db92bb"} Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerDied","Data":"292a2024950e615405e4ab375dd3bfc11b4aa3a56bc9871f42d40431922a45a4"} Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerDied","Data":"1d4c0a46d0e3d58389d6d4c79bc3a4447609fd3f2a0aa21592f1f31e7aea934b"} Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.343198 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerDied","Data":"3943dd6326b04778113bfba3570fc62a9f0270a46dfc202dbf962068fe710305"} Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.667856 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4v25q"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.671543 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.735504 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4v25q"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.787661 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5vmth"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.788736 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.807432 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5vmth"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.879839 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.879958 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tg4d\" (UniqueName: \"kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.927075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a2d6-account-create-update-qh85j"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.928676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.931135 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fhlvn"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.932848 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.933200 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.948291 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-qh85j"] Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.983162 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mnw\" (UniqueName: \"kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.983232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.983315 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tg4d\" (UniqueName: \"kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.983348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " 
pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.984050 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:56 crc kubenswrapper[4830]: I0318 18:21:56.991388 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fhlvn"] Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.003279 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tg4d\" (UniqueName: \"kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d\") pod \"nova-api-db-create-4v25q\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.095248 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-b4pjm"] Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096213 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mnw\" (UniqueName: \"kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096268 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 
crc kubenswrapper[4830]: I0318 18:21:57.096335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxq7\" (UniqueName: \"kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096508 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096527 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdztn\" (UniqueName: \"kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.096988 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.097325 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.105926 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.116499 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-b4pjm"] Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.124249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mnw\" (UniqueName: \"kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw\") pod \"nova-cell0-db-create-5vmth\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") " pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.124645 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.201696 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.201744 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d255h\" (UniqueName: \"kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.201811 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxq7\" (UniqueName: \"kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.201902 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.201952 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.202040 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdztn\" (UniqueName: \"kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.202618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.202940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.218345 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdztn\" (UniqueName: \"kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn\") pod \"nova-api-a2d6-account-create-update-qh85j\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") " pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.218951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8hxq7\" (UniqueName: \"kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7\") pod \"nova-cell1-db-create-fhlvn\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.267437 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.280388 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-qgxqj"] Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.281507 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.284018 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.295502 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.296702 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4v25q" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.299123 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.304327 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d255h\" (UniqueName: \"kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.304492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.306278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.316822 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-qgxqj"] Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.335933 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d255h\" (UniqueName: \"kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h\") pod \"nova-cell0-03fa-account-create-update-b4pjm\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") " pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 
18:21:57.407968 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.409162 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljlp\" (UniqueName: \"kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.484610 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.511546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljlp\" (UniqueName: \"kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.511666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.512347 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.538348 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljlp\" (UniqueName: \"kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp\") pod \"nova-cell1-88cd-account-create-update-qgxqj\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:21:57 crc kubenswrapper[4830]: I0318 18:21:57.631926 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.137645 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564302-vdnd2"] Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.138937 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-vdnd2" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.149678 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.150213 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.150355 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.158595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvg5\" (UniqueName: \"kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5\") pod \"auto-csr-approver-29564302-vdnd2\" (UID: \"00b94971-0d4a-45d3-8b41-4730dbcd9c81\") " pod="openshift-infra/auto-csr-approver-29564302-vdnd2" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.163421 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-vdnd2"] Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.262215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvg5\" (UniqueName: \"kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5\") pod \"auto-csr-approver-29564302-vdnd2\" (UID: \"00b94971-0d4a-45d3-8b41-4730dbcd9c81\") " pod="openshift-infra/auto-csr-approver-29564302-vdnd2" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.285268 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvg5\" (UniqueName: \"kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5\") pod \"auto-csr-approver-29564302-vdnd2\" (UID: \"00b94971-0d4a-45d3-8b41-4730dbcd9c81\") " 
pod="openshift-infra/auto-csr-approver-29564302-vdnd2" Mar 18 18:22:00 crc kubenswrapper[4830]: I0318 18:22:00.459398 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-vdnd2" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.418349 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerStarted","Data":"e4ae57fbf770084e2f021cc7025e5db67046ff08c356ae524ef0c1c7a7981718"} Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.558433 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.701287 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.702669 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.702894 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703006 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703043 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703111 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703156 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtkzl\" (UniqueName: \"kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl\") pod \"82751f36-d42f-4edd-a9c7-e6657da91e34\" (UID: \"82751f36-d42f-4edd-a9c7-e6657da91e34\") " Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703379 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703511 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703962 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.703985 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82751f36-d42f-4edd-a9c7-e6657da91e34-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.711555 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts" (OuterVolumeSpecName: "scripts") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.730322 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl" (OuterVolumeSpecName: "kube-api-access-jtkzl") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "kube-api-access-jtkzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.779357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.805273 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.805314 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtkzl\" (UniqueName: \"kubernetes.io/projected/82751f36-d42f-4edd-a9c7-e6657da91e34-kube-api-access-jtkzl\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.805323 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.844554 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4v25q"] Mar 18 18:22:02 crc kubenswrapper[4830]: I0318 18:22:02.966137 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.008621 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.012221 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data" (OuterVolumeSpecName: "config-data") pod "82751f36-d42f-4edd-a9c7-e6657da91e34" (UID: "82751f36-d42f-4edd-a9c7-e6657da91e34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.113016 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82751f36-d42f-4edd-a9c7-e6657da91e34-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.436181 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5vmth"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.460866 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-vdnd2"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.527844 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-qh85j"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.546391 4830 generic.go:334] "Generic (PLEG): container finished" podID="95625855-e07d-4366-8b59-7bc241752fab" containerID="3607115effce7bd86408002d82ffa86b5174142250b309d1f475cbb22f7dc450" exitCode=0 Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.546484 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4v25q" 
event={"ID":"95625855-e07d-4366-8b59-7bc241752fab","Type":"ContainerDied","Data":"3607115effce7bd86408002d82ffa86b5174142250b309d1f475cbb22f7dc450"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.546513 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4v25q" event={"ID":"95625855-e07d-4366-8b59-7bc241752fab","Type":"ContainerStarted","Data":"713b67111e2b4676e07f6201a4b06175d7879c732dea8731a25e3100bfd68162"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.550750 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-b4pjm"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.571830 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-qgxqj"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.599263 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerStarted","Data":"c0416d3b3912bda28adfb32ff6910ca06aa3d2a68ff4208501b26467c7a964b5"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.599482 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerStarted","Data":"10ea1ae62f7573f638e31db710f3455f544b39c9e8f84f23270b74eeae48b588"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.600536 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.600641 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.603879 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fhlvn"] Mar 18 18:22:03 crc 
kubenswrapper[4830]: I0318 18:22:03.629346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1965a180-09c8-4af1-852e-7792c02564ca","Type":"ContainerStarted","Data":"ad0f8b84dc205164a749c73530020347ca97fd9f6445a06f2cc16f1876d40ecc"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.651507 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82751f36-d42f-4edd-a9c7-e6657da91e34","Type":"ContainerDied","Data":"ca961ff56c2a64ad3c6c171e67ee0c3f74464740fa004e173274e78f41ddc694"} Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.651554 4830 scope.go:117] "RemoveContainer" containerID="0aea7afbac277c581335abdba76abe516fb4c397a82745e2f599d61da1db92bb" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.651580 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.655224 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d76d78d97-bs4hd" podStartSLOduration=9.655205167 podStartE2EDuration="9.655205167s" podCreationTimestamp="2026-03-18 18:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:03.647222653 +0000 UTC m=+1158.214852985" watchObservedRunningTime="2026-03-18 18:22:03.655205167 +0000 UTC m=+1158.222835499" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.683892 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.067758507 podStartE2EDuration="14.683855923s" podCreationTimestamp="2026-03-18 18:21:49 +0000 UTC" firstStartedPulling="2026-03-18 18:21:50.722725673 +0000 UTC m=+1145.290356005" lastFinishedPulling="2026-03-18 18:22:02.338823079 +0000 UTC m=+1156.906453421" observedRunningTime="2026-03-18 18:22:03.663809619 
+0000 UTC m=+1158.231439951" watchObservedRunningTime="2026-03-18 18:22:03.683855923 +0000 UTC m=+1158.251486245" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.698946 4830 scope.go:117] "RemoveContainer" containerID="292a2024950e615405e4ab375dd3bfc11b4aa3a56bc9871f42d40431922a45a4" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.703755 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.717491 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729335 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:03 crc kubenswrapper[4830]: E0318 18:22:03.729681 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="sg-core" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729696 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="sg-core" Mar 18 18:22:03 crc kubenswrapper[4830]: E0318 18:22:03.729711 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-central-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729717 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-central-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: E0318 18:22:03.729728 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="proxy-httpd" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729736 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="proxy-httpd" Mar 18 18:22:03 crc kubenswrapper[4830]: E0318 18:22:03.729747 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-notification-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729752 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-notification-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729928 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="sg-core" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729941 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="proxy-httpd" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729951 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-notification-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.729965 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" containerName="ceilometer-central-agent" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.731387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.733490 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.734205 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.742188 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.783755 4830 scope.go:117] "RemoveContainer" containerID="1d4c0a46d0e3d58389d6d4c79bc3a4447609fd3f2a0aa21592f1f31e7aea934b" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845542 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845594 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845629 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksw6\" (UniqueName: \"kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845828 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.845880 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.889803 4830 scope.go:117] "RemoveContainer" containerID="3943dd6326b04778113bfba3570fc62a9f0270a46dfc202dbf962068fe710305" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.946871 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.946952 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.946974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.947004 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.947034 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.947074 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.947121 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksw6\" (UniqueName: \"kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 
crc kubenswrapper[4830]: I0318 18:22:03.947591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.948307 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.960275 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.960469 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.960478 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.960999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:03 crc kubenswrapper[4830]: I0318 18:22:03.963508 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksw6\" (UniqueName: \"kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6\") pod \"ceilometer-0\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " pod="openstack/ceilometer-0" Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.115194 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.320285 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82751f36-d42f-4edd-a9c7-e6657da91e34" path="/var/lib/kubelet/pods/82751f36-d42f-4edd-a9c7-e6657da91e34/volumes" Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.646584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.678635 4830 generic.go:334] "Generic (PLEG): container finished" podID="74526bf3-152f-40de-9923-9197ffddfc2d" containerID="1107cede367b6677dfcb133136a1fc600b75615f76790b67e94f9a87ead486bd" exitCode=0 Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.678690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fhlvn" event={"ID":"74526bf3-152f-40de-9923-9197ffddfc2d","Type":"ContainerDied","Data":"1107cede367b6677dfcb133136a1fc600b75615f76790b67e94f9a87ead486bd"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.678714 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fhlvn" event={"ID":"74526bf3-152f-40de-9923-9197ffddfc2d","Type":"ContainerStarted","Data":"9406f07eba6d8c4b1484ab6e3ea043ad0c0536a30a9bf10dd4009b94d14e826a"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.693585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564302-vdnd2" event={"ID":"00b94971-0d4a-45d3-8b41-4730dbcd9c81","Type":"ContainerStarted","Data":"e02273a001fbee2f06cf947ea00019899f92f0b8a8ad2535248af558ca2baa44"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.714181 4830 generic.go:334] "Generic (PLEG): container finished" podID="19dfbcf1-1e28-475a-ae01-092ae2e8764a" containerID="27397b7f0510c55e80bce8e6b2943cb87b8939f2cbb3fc1f61ad4e083d4a54cd" exitCode=0 Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.714332 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" event={"ID":"19dfbcf1-1e28-475a-ae01-092ae2e8764a","Type":"ContainerDied","Data":"27397b7f0510c55e80bce8e6b2943cb87b8939f2cbb3fc1f61ad4e083d4a54cd"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.714366 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" event={"ID":"19dfbcf1-1e28-475a-ae01-092ae2e8764a","Type":"ContainerStarted","Data":"f6b10999fbf343097fe1e91766e53c4b574998ea93bc2a9d6501bd955cbb280e"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.752709 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b3aa2e8-fa67-406b-b8cd-e21725c059c3" containerID="95aed85a54b5c9ff61365b28e7d347033cc76e530b5b1df84aab3499edc37f6b" exitCode=0 Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.752835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vmth" event={"ID":"0b3aa2e8-fa67-406b-b8cd-e21725c059c3","Type":"ContainerDied","Data":"95aed85a54b5c9ff61365b28e7d347033cc76e530b5b1df84aab3499edc37f6b"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.752970 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vmth" event={"ID":"0b3aa2e8-fa67-406b-b8cd-e21725c059c3","Type":"ContainerStarted","Data":"4db12bb4422f46f8ff59f023a5b6119666a49512850029e492f061cf45792d4b"} 
Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.756898 4830 generic.go:334] "Generic (PLEG): container finished" podID="1057ef8c-09f6-4dc3-9350-bb834240d748" containerID="41f91f23eaf980683c2e539fdf83cddc5976b7b8cc570cd5a0dd21341c97981e" exitCode=0 Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.756967 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a2d6-account-create-update-qh85j" event={"ID":"1057ef8c-09f6-4dc3-9350-bb834240d748","Type":"ContainerDied","Data":"41f91f23eaf980683c2e539fdf83cddc5976b7b8cc570cd5a0dd21341c97981e"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.756992 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a2d6-account-create-update-qh85j" event={"ID":"1057ef8c-09f6-4dc3-9350-bb834240d748","Type":"ContainerStarted","Data":"f9d995b4704966545a09254e104d5eba5dfc8192e5abf56dea17220fe9a2d3ce"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.785743 4830 generic.go:334] "Generic (PLEG): container finished" podID="523e43c6-ca4d-4107-bb08-02085e1fcd14" containerID="a777ec6c17a30f13877088b3dbdfc19eeb5cb76c8ad52c4fdc99d771910af2fb" exitCode=0 Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.786042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" event={"ID":"523e43c6-ca4d-4107-bb08-02085e1fcd14","Type":"ContainerDied","Data":"a777ec6c17a30f13877088b3dbdfc19eeb5cb76c8ad52c4fdc99d771910af2fb"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.786131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" event={"ID":"523e43c6-ca4d-4107-bb08-02085e1fcd14","Type":"ContainerStarted","Data":"cee584e7138c8c41a36261aed7b6d89f1b3e9f402931894fce8f2683bb7153ec"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.801421 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerStarted","Data":"3bd11c348e4c5f758dec0e1784e4565ae52e14f4f34210864e77888a0125ad1d"} Mar 18 18:22:04 crc kubenswrapper[4830]: I0318 18:22:04.982327 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.190541 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4v25q" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.296062 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts\") pod \"95625855-e07d-4366-8b59-7bc241752fab\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.296158 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tg4d\" (UniqueName: \"kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d\") pod \"95625855-e07d-4366-8b59-7bc241752fab\" (UID: \"95625855-e07d-4366-8b59-7bc241752fab\") " Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.297667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95625855-e07d-4366-8b59-7bc241752fab" (UID: "95625855-e07d-4366-8b59-7bc241752fab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.301152 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d" (OuterVolumeSpecName: "kube-api-access-6tg4d") pod "95625855-e07d-4366-8b59-7bc241752fab" (UID: "95625855-e07d-4366-8b59-7bc241752fab"). InnerVolumeSpecName "kube-api-access-6tg4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.398450 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95625855-e07d-4366-8b59-7bc241752fab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.398499 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tg4d\" (UniqueName: \"kubernetes.io/projected/95625855-e07d-4366-8b59-7bc241752fab-kube-api-access-6tg4d\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.810679 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4v25q" event={"ID":"95625855-e07d-4366-8b59-7bc241752fab","Type":"ContainerDied","Data":"713b67111e2b4676e07f6201a4b06175d7879c732dea8731a25e3100bfd68162"} Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.810962 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="713b67111e2b4676e07f6201a4b06175d7879c732dea8731a25e3100bfd68162" Mar 18 18:22:05 crc kubenswrapper[4830]: I0318 18:22:05.810699 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4v25q" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.349761 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.545006 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts\") pod \"523e43c6-ca4d-4107-bb08-02085e1fcd14\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.545284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljlp\" (UniqueName: \"kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp\") pod \"523e43c6-ca4d-4107-bb08-02085e1fcd14\" (UID: \"523e43c6-ca4d-4107-bb08-02085e1fcd14\") " Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.545693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "523e43c6-ca4d-4107-bb08-02085e1fcd14" (UID: "523e43c6-ca4d-4107-bb08-02085e1fcd14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.545888 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fhlvn" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.546935 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523e43c6-ca4d-4107-bb08-02085e1fcd14-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.552252 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp" (OuterVolumeSpecName: "kube-api-access-nljlp") pod "523e43c6-ca4d-4107-bb08-02085e1fcd14" (UID: "523e43c6-ca4d-4107-bb08-02085e1fcd14"). InnerVolumeSpecName "kube-api-access-nljlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.581163 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.593931 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5vmth" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.597972 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-qh85j" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.653606 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts\") pod \"74526bf3-152f-40de-9923-9197ffddfc2d\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.653703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hxq7\" (UniqueName: \"kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7\") pod \"74526bf3-152f-40de-9923-9197ffddfc2d\" (UID: \"74526bf3-152f-40de-9923-9197ffddfc2d\") " Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.654580 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljlp\" (UniqueName: \"kubernetes.io/projected/523e43c6-ca4d-4107-bb08-02085e1fcd14-kube-api-access-nljlp\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.655671 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74526bf3-152f-40de-9923-9197ffddfc2d" (UID: "74526bf3-152f-40de-9923-9197ffddfc2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.689028 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7" (OuterVolumeSpecName: "kube-api-access-8hxq7") pod "74526bf3-152f-40de-9923-9197ffddfc2d" (UID: "74526bf3-152f-40de-9923-9197ffddfc2d"). InnerVolumeSpecName "kube-api-access-8hxq7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.760764 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7mnw\" (UniqueName: \"kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw\") pod \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761035 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts\") pod \"1057ef8c-09f6-4dc3-9350-bb834240d748\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761209 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts\") pod \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\" (UID: \"0b3aa2e8-fa67-406b-b8cd-e21725c059c3\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761347 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts\") pod \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdztn\" (UniqueName: \"kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn\") pod \"1057ef8c-09f6-4dc3-9350-bb834240d748\" (UID: \"1057ef8c-09f6-4dc3-9350-bb834240d748\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761529 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d255h\" (UniqueName: \"kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h\") pod \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\" (UID: \"19dfbcf1-1e28-475a-ae01-092ae2e8764a\") "
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761564 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1057ef8c-09f6-4dc3-9350-bb834240d748" (UID: "1057ef8c-09f6-4dc3-9350-bb834240d748"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761871 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b3aa2e8-fa67-406b-b8cd-e21725c059c3" (UID: "0b3aa2e8-fa67-406b-b8cd-e21725c059c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.761908 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19dfbcf1-1e28-475a-ae01-092ae2e8764a" (UID: "19dfbcf1-1e28-475a-ae01-092ae2e8764a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.762328 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19dfbcf1-1e28-475a-ae01-092ae2e8764a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.762397 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74526bf3-152f-40de-9923-9197ffddfc2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.762453 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hxq7\" (UniqueName: \"kubernetes.io/projected/74526bf3-152f-40de-9923-9197ffddfc2d-kube-api-access-8hxq7\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.762517 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1057ef8c-09f6-4dc3-9350-bb834240d748-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.762570 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.765314 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h" (OuterVolumeSpecName: "kube-api-access-d255h") pod "19dfbcf1-1e28-475a-ae01-092ae2e8764a" (UID: "19dfbcf1-1e28-475a-ae01-092ae2e8764a"). InnerVolumeSpecName "kube-api-access-d255h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.765513 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw" (OuterVolumeSpecName: "kube-api-access-t7mnw") pod "0b3aa2e8-fa67-406b-b8cd-e21725c059c3" (UID: "0b3aa2e8-fa67-406b-b8cd-e21725c059c3"). InnerVolumeSpecName "kube-api-access-t7mnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.770109 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn" (OuterVolumeSpecName: "kube-api-access-pdztn") pod "1057ef8c-09f6-4dc3-9350-bb834240d748" (UID: "1057ef8c-09f6-4dc3-9350-bb834240d748"). InnerVolumeSpecName "kube-api-access-pdztn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.833162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerStarted","Data":"4663c0ef68fb98defea46668bd69633ce844fb9343eac7a6f6be7e3cf96ef15f"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.838867 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fhlvn" event={"ID":"74526bf3-152f-40de-9923-9197ffddfc2d","Type":"ContainerDied","Data":"9406f07eba6d8c4b1484ab6e3ea043ad0c0536a30a9bf10dd4009b94d14e826a"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.839011 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9406f07eba6d8c4b1484ab6e3ea043ad0c0536a30a9bf10dd4009b94d14e826a"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.839223 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fhlvn"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.843300 4830 generic.go:334] "Generic (PLEG): container finished" podID="00b94971-0d4a-45d3-8b41-4730dbcd9c81" containerID="4c0b35b5d195b0f9135287d08ef0223c294be7b2eccc6c9451f2bc66faf3424e" exitCode=0
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.843358 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-vdnd2" event={"ID":"00b94971-0d4a-45d3-8b41-4730dbcd9c81","Type":"ContainerDied","Data":"4c0b35b5d195b0f9135287d08ef0223c294be7b2eccc6c9451f2bc66faf3424e"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.848196 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm" event={"ID":"19dfbcf1-1e28-475a-ae01-092ae2e8764a","Type":"ContainerDied","Data":"f6b10999fbf343097fe1e91766e53c4b574998ea93bc2a9d6501bd955cbb280e"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.848243 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b10999fbf343097fe1e91766e53c4b574998ea93bc2a9d6501bd955cbb280e"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.848214 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-b4pjm"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.850333 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vmth" event={"ID":"0b3aa2e8-fa67-406b-b8cd-e21725c059c3","Type":"ContainerDied","Data":"4db12bb4422f46f8ff59f023a5b6119666a49512850029e492f061cf45792d4b"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.850366 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db12bb4422f46f8ff59f023a5b6119666a49512850029e492f061cf45792d4b"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.850417 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5vmth"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.857359 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-qh85j"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.857517 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a2d6-account-create-update-qh85j" event={"ID":"1057ef8c-09f6-4dc3-9350-bb834240d748","Type":"ContainerDied","Data":"f9d995b4704966545a09254e104d5eba5dfc8192e5abf56dea17220fe9a2d3ce"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.857556 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d995b4704966545a09254e104d5eba5dfc8192e5abf56dea17220fe9a2d3ce"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.864490 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7mnw\" (UniqueName: \"kubernetes.io/projected/0b3aa2e8-fa67-406b-b8cd-e21725c059c3-kube-api-access-t7mnw\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.864756 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdztn\" (UniqueName: \"kubernetes.io/projected/1057ef8c-09f6-4dc3-9350-bb834240d748-kube-api-access-pdztn\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.864886 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d255h\" (UniqueName: \"kubernetes.io/projected/19dfbcf1-1e28-475a-ae01-092ae2e8764a-kube-api-access-d255h\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.866138 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj" event={"ID":"523e43c6-ca4d-4107-bb08-02085e1fcd14","Type":"ContainerDied","Data":"cee584e7138c8c41a36261aed7b6d89f1b3e9f402931894fce8f2683bb7153ec"}
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.866168 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee584e7138c8c41a36261aed7b6d89f1b3e9f402931894fce8f2683bb7153ec"
Mar 18 18:22:06 crc kubenswrapper[4830]: I0318 18:22:06.866171 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-qgxqj"
Mar 18 18:22:07 crc kubenswrapper[4830]: I0318 18:22:07.878074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerStarted","Data":"d0a987d37d3dffa3fb5dfc9003643d84d015b1e84822b6e7f569d657ca1e841e"}
Mar 18 18:22:07 crc kubenswrapper[4830]: I0318 18:22:07.878392 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerStarted","Data":"d10e7880f72e640772c0234f3bef0d3e3e31a5cf4a7d4d00882e87c6c43bb6bb"}
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.221426 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-vdnd2"
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.392800 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvg5\" (UniqueName: \"kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5\") pod \"00b94971-0d4a-45d3-8b41-4730dbcd9c81\" (UID: \"00b94971-0d4a-45d3-8b41-4730dbcd9c81\") "
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.397403 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5" (OuterVolumeSpecName: "kube-api-access-pwvg5") pod "00b94971-0d4a-45d3-8b41-4730dbcd9c81" (UID: "00b94971-0d4a-45d3-8b41-4730dbcd9c81"). InnerVolumeSpecName "kube-api-access-pwvg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.497265 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvg5\" (UniqueName: \"kubernetes.io/projected/00b94971-0d4a-45d3-8b41-4730dbcd9c81-kube-api-access-pwvg5\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.891893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-vdnd2" event={"ID":"00b94971-0d4a-45d3-8b41-4730dbcd9c81","Type":"ContainerDied","Data":"e02273a001fbee2f06cf947ea00019899f92f0b8a8ad2535248af558ca2baa44"}
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.891947 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e02273a001fbee2f06cf947ea00019899f92f0b8a8ad2535248af558ca2baa44"
Mar 18 18:22:08 crc kubenswrapper[4830]: I0318 18:22:08.892032 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-vdnd2"
Mar 18 18:22:09 crc kubenswrapper[4830]: I0318 18:22:09.329925 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-x6d6s"]
Mar 18 18:22:09 crc kubenswrapper[4830]: I0318 18:22:09.337599 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-x6d6s"]
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.247363 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4491dda-341a-42c5-b842-c42e7ac2f5bd" path="/var/lib/kubelet/pods/e4491dda-341a-42c5-b842-c42e7ac2f5bd/volumes"
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.315928 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d76d78d97-bs4hd"
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.320058 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d76d78d97-bs4hd"
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.927925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerStarted","Data":"da409b6c55821a0b72df9ec24021047d29fc13fb430de82427e6b0b8a3a01a0b"}
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.928437 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="sg-core" containerID="cri-o://d0a987d37d3dffa3fb5dfc9003643d84d015b1e84822b6e7f569d657ca1e841e" gracePeriod=30
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.928489 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-notification-agent" containerID="cri-o://d10e7880f72e640772c0234f3bef0d3e3e31a5cf4a7d4d00882e87c6c43bb6bb" gracePeriod=30
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.928441 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="proxy-httpd" containerID="cri-o://da409b6c55821a0b72df9ec24021047d29fc13fb430de82427e6b0b8a3a01a0b" gracePeriod=30
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.928399 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-central-agent" containerID="cri-o://4663c0ef68fb98defea46668bd69633ce844fb9343eac7a6f6be7e3cf96ef15f" gracePeriod=30
Mar 18 18:22:10 crc kubenswrapper[4830]: I0318 18:22:10.956031 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.552703755 podStartE2EDuration="7.956010453s" podCreationTimestamp="2026-03-18 18:22:03 +0000 UTC" firstStartedPulling="2026-03-18 18:22:04.648545045 +0000 UTC m=+1159.216175387" lastFinishedPulling="2026-03-18 18:22:10.051851753 +0000 UTC m=+1164.619482085" observedRunningTime="2026-03-18 18:22:10.953074831 +0000 UTC m=+1165.520705163" watchObservedRunningTime="2026-03-18 18:22:10.956010453 +0000 UTC m=+1165.523640785"
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.740498 4830 scope.go:117] "RemoveContainer" containerID="bbd47d7797a6e3b1e959aae752fd9cafe12dda4054864a8b3ab7c23584ffcf72"
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.970946 4830 generic.go:334] "Generic (PLEG): container finished" podID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerID="da409b6c55821a0b72df9ec24021047d29fc13fb430de82427e6b0b8a3a01a0b" exitCode=0
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.970973 4830 generic.go:334] "Generic (PLEG): container finished" podID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerID="d0a987d37d3dffa3fb5dfc9003643d84d015b1e84822b6e7f569d657ca1e841e" exitCode=2
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.970982 4830 generic.go:334] "Generic (PLEG): container finished" podID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerID="d10e7880f72e640772c0234f3bef0d3e3e31a5cf4a7d4d00882e87c6c43bb6bb" exitCode=0
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.970977 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerDied","Data":"da409b6c55821a0b72df9ec24021047d29fc13fb430de82427e6b0b8a3a01a0b"}
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.971053 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerDied","Data":"d0a987d37d3dffa3fb5dfc9003643d84d015b1e84822b6e7f569d657ca1e841e"}
Mar 18 18:22:11 crc kubenswrapper[4830]: I0318 18:22:11.971085 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerDied","Data":"d10e7880f72e640772c0234f3bef0d3e3e31a5cf4a7d4d00882e87c6c43bb6bb"}
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.369675 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbk9z"]
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370043 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95625855-e07d-4366-8b59-7bc241752fab" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370054 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="95625855-e07d-4366-8b59-7bc241752fab" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370068 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1057ef8c-09f6-4dc3-9350-bb834240d748" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370074 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1057ef8c-09f6-4dc3-9350-bb834240d748" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370091 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523e43c6-ca4d-4107-bb08-02085e1fcd14" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370097 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="523e43c6-ca4d-4107-bb08-02085e1fcd14" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370109 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b94971-0d4a-45d3-8b41-4730dbcd9c81" containerName="oc"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370115 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b94971-0d4a-45d3-8b41-4730dbcd9c81" containerName="oc"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370123 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74526bf3-152f-40de-9923-9197ffddfc2d" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370129 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="74526bf3-152f-40de-9923-9197ffddfc2d" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370152 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19dfbcf1-1e28-475a-ae01-092ae2e8764a" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370157 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="19dfbcf1-1e28-475a-ae01-092ae2e8764a" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: E0318 18:22:12.370165 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3aa2e8-fa67-406b-b8cd-e21725c059c3" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370171 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3aa2e8-fa67-406b-b8cd-e21725c059c3" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370311 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3aa2e8-fa67-406b-b8cd-e21725c059c3" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370329 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="523e43c6-ca4d-4107-bb08-02085e1fcd14" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370339 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="95625855-e07d-4366-8b59-7bc241752fab" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370347 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="74526bf3-152f-40de-9923-9197ffddfc2d" containerName="mariadb-database-create"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370367 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="19dfbcf1-1e28-475a-ae01-092ae2e8764a" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370375 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1057ef8c-09f6-4dc3-9350-bb834240d748" containerName="mariadb-account-create-update"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.370383 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b94971-0d4a-45d3-8b41-4730dbcd9c81" containerName="oc"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.372863 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.376100 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.376272 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v7lpj"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.376387 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.390146 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbk9z"]
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.463755 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.463947 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.464039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.464174 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vfp4\" (UniqueName: \"kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.565870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.566241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.566275 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.566324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vfp4\" (UniqueName: \"kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.574343 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.574787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.575933 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.582350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vfp4\" (UniqueName: \"kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4\") pod \"nova-cell0-conductor-db-sync-mbk9z\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:12 crc kubenswrapper[4830]: I0318 18:22:12.688806 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbk9z"
Mar 18 18:22:13 crc kubenswrapper[4830]: I0318 18:22:13.376710 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbk9z"]
Mar 18 18:22:14 crc kubenswrapper[4830]: I0318 18:22:14.023875 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" event={"ID":"0c12321e-7436-4126-9ad3-597fa7216bc8","Type":"ContainerStarted","Data":"741fec637dedd1b67c3a0dea78ad608663d4bbcab737663721475d70f8ee784c"}
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.026076 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-676956db6-6grw2"
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.028316 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-676956db6-6grw2"
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.058674 4830 generic.go:334] "Generic (PLEG): container finished" podID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerID="4663c0ef68fb98defea46668bd69633ce844fb9343eac7a6f6be7e3cf96ef15f" exitCode=0
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.058762 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerDied","Data":"4663c0ef68fb98defea46668bd69633ce844fb9343eac7a6f6be7e3cf96ef15f"}
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.111320 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-788c577778-lg8kp"]
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.111562 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-788c577778-lg8kp" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-log" containerID="cri-o://bcdbca28adb2b677a3b5cdbe0dc2125d47ea6cf58677c075e7a03027a4601543" gracePeriod=30
Mar 18 18:22:17 crc kubenswrapper[4830]: I0318 18:22:17.111991 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-788c577778-lg8kp" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-api" containerID="cri-o://c528121726764a5f97954024e587f792b928d1ed21a311edd7665687664f45be" gracePeriod=30
Mar 18 18:22:18 crc kubenswrapper[4830]: I0318 18:22:18.067900 4830 generic.go:334] "Generic (PLEG): container finished" podID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerID="bcdbca28adb2b677a3b5cdbe0dc2125d47ea6cf58677c075e7a03027a4601543" exitCode=143
Mar 18 18:22:18 crc kubenswrapper[4830]: I0318 18:22:18.068171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerDied","Data":"bcdbca28adb2b677a3b5cdbe0dc2125d47ea6cf58677c075e7a03027a4601543"}
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.102966 4830 generic.go:334] "Generic (PLEG): container finished" podID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerID="c528121726764a5f97954024e587f792b928d1ed21a311edd7665687664f45be" exitCode=0
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.103062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerDied","Data":"c528121726764a5f97954024e587f792b928d1ed21a311edd7665687664f45be"}
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.708745 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788c577778-lg8kp"
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.725522 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840615 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840669 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840720 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksw6\" (UniqueName: \"kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840809 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840870 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840901 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840944 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.840984 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwjb\" (UniqueName: \"kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") "
Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.841002 4830
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.841036 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs\") pod \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\" (UID: \"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4\") " Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.841085 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd\") pod \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\" (UID: \"18ee448a-62ed-4a7e-b28f-62cea12a4c5b\") " Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.841690 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.841993 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.842000 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs" (OuterVolumeSpecName: "logs") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.847270 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6" (OuterVolumeSpecName: "kube-api-access-2ksw6") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "kube-api-access-2ksw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.847447 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb" (OuterVolumeSpecName: "kube-api-access-zbwjb") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "kube-api-access-zbwjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.848411 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts" (OuterVolumeSpecName: "scripts") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.849039 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts" (OuterVolumeSpecName: "scripts") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.878182 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.895422 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data" (OuterVolumeSpecName: "config-data") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.900991 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.934866 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945841 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945871 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945879 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945888 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945898 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945907 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwjb\" (UniqueName: 
\"kubernetes.io/projected/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-kube-api-access-zbwjb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945917 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945925 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945933 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945940 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ksw6\" (UniqueName: \"kubernetes.io/projected/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-kube-api-access-2ksw6\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.945948 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.951633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.957957 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" (UID: "4cea3ec6-0f57-4c56-92fb-5dc40de81fe4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[4830]: I0318 18:22:21.964564 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data" (OuterVolumeSpecName: "config-data") pod "18ee448a-62ed-4a7e-b28f-62cea12a4c5b" (UID: "18ee448a-62ed-4a7e-b28f-62cea12a4c5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.048474 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ee448a-62ed-4a7e-b28f-62cea12a4c5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.048519 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.048537 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.116943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"18ee448a-62ed-4a7e-b28f-62cea12a4c5b","Type":"ContainerDied","Data":"3bd11c348e4c5f758dec0e1784e4565ae52e14f4f34210864e77888a0125ad1d"} Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.117017 4830 scope.go:117] "RemoveContainer" containerID="da409b6c55821a0b72df9ec24021047d29fc13fb430de82427e6b0b8a3a01a0b" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.117179 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.121241 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" event={"ID":"0c12321e-7436-4126-9ad3-597fa7216bc8","Type":"ContainerStarted","Data":"ff1f5ffd4b74593f213089537aa13600d46dd89ff13f70a78e08d9778968bb0c"} Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.125539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788c577778-lg8kp" event={"ID":"4cea3ec6-0f57-4c56-92fb-5dc40de81fe4","Type":"ContainerDied","Data":"f550bc3b9d453218b2c6339b04d35bb1b92975b54fa0f5a9f503e69992114125"} Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.125618 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-788c577778-lg8kp" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.156685 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" podStartSLOduration=1.8964568019999999 podStartE2EDuration="10.1566639s" podCreationTimestamp="2026-03-18 18:22:12 +0000 UTC" firstStartedPulling="2026-03-18 18:22:13.382989466 +0000 UTC m=+1167.950619798" lastFinishedPulling="2026-03-18 18:22:21.643196564 +0000 UTC m=+1176.210826896" observedRunningTime="2026-03-18 18:22:22.150720103 +0000 UTC m=+1176.718350455" watchObservedRunningTime="2026-03-18 18:22:22.1566639 +0000 UTC m=+1176.724294242" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.176511 4830 scope.go:117] "RemoveContainer" containerID="d0a987d37d3dffa3fb5dfc9003643d84d015b1e84822b6e7f569d657ca1e841e" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.190609 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.209398 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.223319 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.223888 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-api" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.223912 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-api" Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.223938 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-central-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.223949 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-central-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.223974 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="sg-core" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.223986 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="sg-core" Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.224001 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-log" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224012 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-log" Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.224029 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="proxy-httpd" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224041 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="proxy-httpd" Mar 18 18:22:22 crc kubenswrapper[4830]: E0318 18:22:22.224075 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-notification-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224088 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-notification-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224359 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-central-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224396 
4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="ceilometer-notification-agent" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224415 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-api" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224426 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="proxy-httpd" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224441 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" containerName="sg-core" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.224457 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" containerName="placement-log" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.226470 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.229190 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.229412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.232081 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.260669 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ee448a-62ed-4a7e-b28f-62cea12a4c5b" path="/var/lib/kubelet/pods/18ee448a-62ed-4a7e-b28f-62cea12a4c5b/volumes" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.296762 4830 scope.go:117] "RemoveContainer" containerID="d10e7880f72e640772c0234f3bef0d3e3e31a5cf4a7d4d00882e87c6c43bb6bb" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.299007 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-788c577778-lg8kp"] Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.307073 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-788c577778-lg8kp"] Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.319108 4830 scope.go:117] "RemoveContainer" containerID="4663c0ef68fb98defea46668bd69633ce844fb9343eac7a6f6be7e3cf96ef15f" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.335759 4830 scope.go:117] "RemoveContainer" containerID="c528121726764a5f97954024e587f792b928d1ed21a311edd7665687664f45be" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359163 4830 scope.go:117] "RemoveContainer" containerID="bcdbca28adb2b677a3b5cdbe0dc2125d47ea6cf58677c075e7a03027a4601543" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359329 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359403 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359431 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359452 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd\") pod \"ceilometer-0\" (UID: 
\"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.359494 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzfz\" (UniqueName: \"kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461097 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461146 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461305 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461358 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzfz\" (UniqueName: \"kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.461508 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.462935 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.467012 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.468252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc 
kubenswrapper[4830]: I0318 18:22:22.468308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.468728 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.468813 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.483737 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzfz\" (UniqueName: \"kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz\") pod \"ceilometer-0\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") " pod="openstack/ceilometer-0" Mar 18 18:22:22 crc kubenswrapper[4830]: I0318 18:22:22.587818 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:22:23 crc kubenswrapper[4830]: I0318 18:22:23.069642 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:23 crc kubenswrapper[4830]: W0318 18:22:23.071660 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod931cef43_2b46_4774_9c28_1993c52f8131.slice/crio-5f69faffaef725993b29171f496cf8d763dcd0a962eeb28adbd6dbd4fe1a07e7 WatchSource:0}: Error finding container 5f69faffaef725993b29171f496cf8d763dcd0a962eeb28adbd6dbd4fe1a07e7: Status 404 returned error can't find the container with id 5f69faffaef725993b29171f496cf8d763dcd0a962eeb28adbd6dbd4fe1a07e7
Mar 18 18:22:23 crc kubenswrapper[4830]: I0318 18:22:23.137313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerStarted","Data":"5f69faffaef725993b29171f496cf8d763dcd0a962eeb28adbd6dbd4fe1a07e7"}
Mar 18 18:22:24 crc kubenswrapper[4830]: I0318 18:22:24.153017 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerStarted","Data":"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0"}
Mar 18 18:22:24 crc kubenswrapper[4830]: I0318 18:22:24.255497 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cea3ec6-0f57-4c56-92fb-5dc40de81fe4" path="/var/lib/kubelet/pods/4cea3ec6-0f57-4c56-92fb-5dc40de81fe4/volumes"
Mar 18 18:22:25 crc kubenswrapper[4830]: I0318 18:22:25.174804 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerStarted","Data":"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76"}
Mar 18 18:22:26 crc kubenswrapper[4830]: I0318 18:22:26.192760 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerStarted","Data":"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd"}
Mar 18 18:22:28 crc kubenswrapper[4830]: I0318 18:22:28.219029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerStarted","Data":"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc"}
Mar 18 18:22:28 crc kubenswrapper[4830]: I0318 18:22:28.219530 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 18:22:28 crc kubenswrapper[4830]: I0318 18:22:28.248753 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.327027507 podStartE2EDuration="6.248716662s" podCreationTimestamp="2026-03-18 18:22:22 +0000 UTC" firstStartedPulling="2026-03-18 18:22:23.073925198 +0000 UTC m=+1177.641555530" lastFinishedPulling="2026-03-18 18:22:26.995614343 +0000 UTC m=+1181.563244685" observedRunningTime="2026-03-18 18:22:28.247364974 +0000 UTC m=+1182.814995386" watchObservedRunningTime="2026-03-18 18:22:28.248716662 +0000 UTC m=+1182.816346994"
Mar 18 18:22:29 crc kubenswrapper[4830]: I0318 18:22:29.510312 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:22:29 crc kubenswrapper[4830]: I0318 18:22:29.510756 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:22:29 crc kubenswrapper[4830]: I0318 18:22:29.568812 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:22:29 crc kubenswrapper[4830]: I0318 18:22:29.569178 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-log" containerID="cri-o://eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d" gracePeriod=30
Mar 18 18:22:29 crc kubenswrapper[4830]: I0318 18:22:29.569326 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-httpd" containerID="cri-o://824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547" gracePeriod=30
Mar 18 18:22:30 crc kubenswrapper[4830]: I0318 18:22:30.247450 4830 generic.go:334] "Generic (PLEG): container finished" podID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerID="eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d" exitCode=143
Mar 18 18:22:30 crc kubenswrapper[4830]: I0318 18:22:30.247552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerDied","Data":"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"}
Mar 18 18:22:30 crc kubenswrapper[4830]: I0318 18:22:30.425320 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:22:30 crc kubenswrapper[4830]: I0318 18:22:30.425613 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-log" containerID="cri-o://9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e" gracePeriod=30
Mar 18 18:22:30 crc kubenswrapper[4830]: I0318 18:22:30.425746 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-httpd" containerID="cri-o://f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79" gracePeriod=30
Mar 18 18:22:31 crc kubenswrapper[4830]: I0318 18:22:31.265848 4830 generic.go:334] "Generic (PLEG): container finished" podID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerID="9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e" exitCode=143
Mar 18 18:22:31 crc kubenswrapper[4830]: I0318 18:22:31.265928 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerDied","Data":"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e"}
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.005060 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.006891 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-central-agent" containerID="cri-o://a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" gracePeriod=30
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.006977 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="proxy-httpd" containerID="cri-o://260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc" gracePeriod=30
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.006978 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="sg-core" containerID="cri-o://abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd" gracePeriod=30
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.006998 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-notification-agent" containerID="cri-o://fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" gracePeriod=30
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.311803 4830 generic.go:334] "Generic (PLEG): container finished" podID="931cef43-2b46-4774-9c28-1993c52f8131" containerID="260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc" exitCode=0
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.311841 4830 generic.go:334] "Generic (PLEG): container finished" podID="931cef43-2b46-4774-9c28-1993c52f8131" containerID="abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd" exitCode=2
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.311862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerDied","Data":"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc"}
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.311890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerDied","Data":"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd"}
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.819592 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986517 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986697 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986754 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986813 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phzfz\" (UniqueName: \"kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.986846 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml\") pod \"931cef43-2b46-4774-9c28-1993c52f8131\" (UID: \"931cef43-2b46-4774-9c28-1993c52f8131\") "
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.993465 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.994309 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:32 crc kubenswrapper[4830]: I0318 18:22:32.998617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts" (OuterVolumeSpecName: "scripts") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.000282 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz" (OuterVolumeSpecName: "kube-api-access-phzfz") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "kube-api-access-phzfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.035441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.092137 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.092371 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phzfz\" (UniqueName: \"kubernetes.io/projected/931cef43-2b46-4774-9c28-1993c52f8131-kube-api-access-phzfz\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.092437 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.092492 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/931cef43-2b46-4774-9c28-1993c52f8131-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.092551 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.117994 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.136567 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data" (OuterVolumeSpecName: "config-data") pod "931cef43-2b46-4774-9c28-1993c52f8131" (UID: "931cef43-2b46-4774-9c28-1993c52f8131"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.195404 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.195463 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931cef43-2b46-4774-9c28-1993c52f8131-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.300476 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.331044 4830 generic.go:334] "Generic (PLEG): container finished" podID="0c12321e-7436-4126-9ad3-597fa7216bc8" containerID="ff1f5ffd4b74593f213089537aa13600d46dd89ff13f70a78e08d9778968bb0c" exitCode=0
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.331110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" event={"ID":"0c12321e-7436-4126-9ad3-597fa7216bc8","Type":"ContainerDied","Data":"ff1f5ffd4b74593f213089537aa13600d46dd89ff13f70a78e08d9778968bb0c"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.345665 4830 generic.go:334] "Generic (PLEG): container finished" podID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerID="824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547" exitCode=0
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.345718 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.345751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerDied","Data":"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.345806 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af2a66cf-2d32-4beb-9df1-e3958a2ff5de","Type":"ContainerDied","Data":"8924caab6e84c4053d4b26a930837897ee672254fd2d03647d16b9a320dd4c1a"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.345835 4830 scope.go:117] "RemoveContainer" containerID="824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352286 4830 generic.go:334] "Generic (PLEG): container finished" podID="931cef43-2b46-4774-9c28-1993c52f8131" containerID="fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" exitCode=0
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352356 4830 generic.go:334] "Generic (PLEG): container finished" podID="931cef43-2b46-4774-9c28-1993c52f8131" containerID="a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" exitCode=0
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352382 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerDied","Data":"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerDied","Data":"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352416 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"931cef43-2b46-4774-9c28-1993c52f8131","Type":"ContainerDied","Data":"5f69faffaef725993b29171f496cf8d763dcd0a962eeb28adbd6dbd4fe1a07e7"}
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.352459 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.398874 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399385 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zskfq\" (UniqueName: \"kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399496 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399522 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.399536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\" (UID: \"af2a66cf-2d32-4beb-9df1-e3958a2ff5de\") "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.400365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.400885 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs" (OuterVolumeSpecName: "logs") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.402537 4830 scope.go:117] "RemoveContainer" containerID="eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.413584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.413841 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts" (OuterVolumeSpecName: "scripts") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.415053 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.421917 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq" (OuterVolumeSpecName: "kube-api-access-zskfq") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "kube-api-access-zskfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.431783 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.452738 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464182 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464601 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="proxy-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464618 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="proxy-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464634 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464642 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464654 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="sg-core"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464660 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="sg-core"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464670 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-central-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464675 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-central-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464686 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-notification-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464691 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-notification-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.464703 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-log"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464708 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-log"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464889 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="proxy-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464902 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-central-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464916 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="sg-core"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464929 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-log"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464938 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="931cef43-2b46-4774-9c28-1993c52f8131" containerName="ceilometer-notification-agent"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.464948 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" containerName="glance-httpd"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.466905 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.469903 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.472489 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.481997 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.484143 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data" (OuterVolumeSpecName: "config-data") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.503959 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504010 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504021 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504034 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zskfq\" (UniqueName: \"kubernetes.io/projected/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-kube-api-access-zskfq\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504049 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504059 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.504095 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.532187 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.535530 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "af2a66cf-2d32-4beb-9df1-e3958a2ff5de" (UID: "af2a66cf-2d32-4beb-9df1-e3958a2ff5de"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.593250 4830 scope.go:117] "RemoveContainer" containerID="824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.595017 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547\": container with ID starting with 824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547 not found: ID does not exist" containerID="824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.595076 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547"} err="failed to get container status \"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547\": rpc error: code = NotFound desc = could not find container \"824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547\": container with ID starting with 824bf5a73987f5113ac249c45ebbb7a4b623031c0aac26c391413ea78d6b2547 not found: ID does not exist"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.595110 4830 scope.go:117] "RemoveContainer" containerID="eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"
Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.595518 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d\": container with ID starting with eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d not found: ID does not exist" containerID="eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.595562 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d"} err="failed to get container status \"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d\": rpc error: code = NotFound desc = could not find container \"eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d\": container with ID starting with eeaeeb52c55c3b4e532608d4a697fea60145f3d263eab500579c40079e06d51d not found: ID does not exist"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.595591 4830 scope.go:117] "RemoveContainer" containerID="260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.605717 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.605847 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.605887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.605925 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.605957 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.606008 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6xj\" (UniqueName: \"kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.606143 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.606239 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.606261 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2a66cf-2d32-4beb-9df1-e3958a2ff5de-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.680698 4830 scope.go:117] "RemoveContainer" containerID="abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.688492 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.701105 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709085 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709140 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0"
Mar 18 18:22:33 crc kubenswrapper[4830]: I0318
18:22:33.709167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709220 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6xj\" (UniqueName: \"kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709330 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.709943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.714131 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.716587 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.718595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.721823 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.722879 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.737359 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6xj\" (UniqueName: \"kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj\") pod \"ceilometer-0\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.737439 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:22:33 crc 
kubenswrapper[4830]: I0318 18:22:33.738829 4830 scope.go:117] "RemoveContainer" containerID="fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.739434 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.745050 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.745261 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.745666 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.794683 4830 scope.go:117] "RemoveContainer" containerID="a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.818991 4830 scope.go:117] "RemoveContainer" containerID="260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc" Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.820112 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc\": container with ID starting with 260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc not found: ID does not exist" containerID="260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820143 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc"} err="failed to get container status 
\"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc\": rpc error: code = NotFound desc = could not find container \"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc\": container with ID starting with 260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820163 4830 scope.go:117] "RemoveContainer" containerID="abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd" Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.820369 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd\": container with ID starting with abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd not found: ID does not exist" containerID="abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820386 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd"} err="failed to get container status \"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd\": rpc error: code = NotFound desc = could not find container \"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd\": container with ID starting with abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820397 4830 scope.go:117] "RemoveContainer" containerID="fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.820541 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76\": container with ID starting with fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76 not found: ID does not exist" containerID="fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820557 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76"} err="failed to get container status \"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76\": rpc error: code = NotFound desc = could not find container \"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76\": container with ID starting with fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76 not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820567 4830 scope.go:117] "RemoveContainer" containerID="a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" Mar 18 18:22:33 crc kubenswrapper[4830]: E0318 18:22:33.820706 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0\": container with ID starting with a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0 not found: ID does not exist" containerID="a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820721 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0"} err="failed to get container status \"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0\": rpc error: code = NotFound desc = could not find container \"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0\": container with ID 
starting with a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0 not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820734 4830 scope.go:117] "RemoveContainer" containerID="260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820947 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc"} err="failed to get container status \"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc\": rpc error: code = NotFound desc = could not find container \"260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc\": container with ID starting with 260cb0ed5c756d6068933a7b79022d8709a37f96f1dc54ceeb55708fca7420fc not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.820964 4830 scope.go:117] "RemoveContainer" containerID="abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.821091 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd"} err="failed to get container status \"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd\": rpc error: code = NotFound desc = could not find container \"abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd\": container with ID starting with abd3e78a4500ff21131db49133175683c761aea36090c7ec2532bfe04b4515dd not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.821106 4830 scope.go:117] "RemoveContainer" containerID="fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.821252 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76"} err="failed to get container status \"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76\": rpc error: code = NotFound desc = could not find container \"fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76\": container with ID starting with fbc292b60f63f5f72c9ea3eddd30e732c09373740daf20121910f96cdd19ff76 not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.821268 4830 scope.go:117] "RemoveContainer" containerID="a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.821404 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0"} err="failed to get container status \"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0\": rpc error: code = NotFound desc = could not find container \"a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0\": container with ID starting with a7ca8bdc24e0f3b4cb0d43e9dae73d8a9dad5aeaf424cfc2b8a27bbad63affe0 not found: ID does not exist" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.891564 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913497 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vngf\" (UniqueName: \"kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913640 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913666 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913713 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913804 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913845 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:33 crc kubenswrapper[4830]: I0318 18:22:33.913874 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015548 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015606 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015632 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vngf\" (UniqueName: \"kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015668 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015695 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 
18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.015711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.018357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.019999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.021024 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.022822 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.026388 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.026783 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.027223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.036726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vngf\" (UniqueName: \"kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.057201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.079413 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.096889 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.218680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.218851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.218919 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.218937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zsb6\" (UniqueName: \"kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.219018 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc 
kubenswrapper[4830]: I0318 18:22:34.219051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.219104 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.219129 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data\") pod \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\" (UID: \"dee116fa-07c8-44cf-b7b9-8dd248a32d82\") " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.220275 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.220541 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs" (OuterVolumeSpecName: "logs") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.225252 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6" (OuterVolumeSpecName: "kube-api-access-9zsb6") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "kube-api-access-9zsb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.226214 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.232166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts" (OuterVolumeSpecName: "scripts") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.249378 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.264110 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931cef43-2b46-4774-9c28-1993c52f8131" path="/var/lib/kubelet/pods/931cef43-2b46-4774-9c28-1993c52f8131/volumes" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.265393 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2a66cf-2d32-4beb-9df1-e3958a2ff5de" path="/var/lib/kubelet/pods/af2a66cf-2d32-4beb-9df1-e3958a2ff5de/volumes" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.315307 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data" (OuterVolumeSpecName: "config-data") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.315383 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dee116fa-07c8-44cf-b7b9-8dd248a32d82" (UID: "dee116fa-07c8-44cf-b7b9-8dd248a32d82"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321745 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321795 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zsb6\" (UniqueName: \"kubernetes.io/projected/dee116fa-07c8-44cf-b7b9-8dd248a32d82-kube-api-access-9zsb6\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321806 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321815 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dee116fa-07c8-44cf-b7b9-8dd248a32d82-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321825 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321834 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321844 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.321855 4830 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee116fa-07c8-44cf-b7b9-8dd248a32d82-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.424532 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.440234 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.447948 4830 generic.go:334] "Generic (PLEG): container finished" podID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerID="f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79" exitCode=0 Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.448094 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.448180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerDied","Data":"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79"} Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.448246 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dee116fa-07c8-44cf-b7b9-8dd248a32d82","Type":"ContainerDied","Data":"28dbbaecfe46ada24a64776bfaff890b440b4d7fc6f29f59cf8e5ee8e83507a8"} Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.448269 4830 scope.go:117] "RemoveContainer" containerID="f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.497933 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 
18:22:34.507383 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.510331 4830 scope.go:117] "RemoveContainer" containerID="9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.520215 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: E0318 18:22:34.520628 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-log" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.520644 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-log" Mar 18 18:22:34 crc kubenswrapper[4830]: E0318 18:22:34.520666 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-httpd" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.520682 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-httpd" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.520927 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-httpd" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.520950 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" containerName="glance-log" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.521874 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.524486 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.524670 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.525727 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.530064 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.545877 4830 scope.go:117] "RemoveContainer" containerID="f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79" Mar 18 18:22:34 crc kubenswrapper[4830]: E0318 18:22:34.548473 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79\": container with ID starting with f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79 not found: ID does not exist" containerID="f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.548512 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79"} err="failed to get container status \"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79\": rpc error: code = NotFound desc = could not find container \"f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79\": container with ID starting with 
f2ae5ca0cd907c8d00efc0bdb756c6268dc3f31fdaf2ff901140788845cf5b79 not found: ID does not exist" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.548539 4830 scope.go:117] "RemoveContainer" containerID="9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e" Mar 18 18:22:34 crc kubenswrapper[4830]: E0318 18:22:34.548831 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e\": container with ID starting with 9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e not found: ID does not exist" containerID="9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.548855 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e"} err="failed to get container status \"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e\": rpc error: code = NotFound desc = could not find container \"9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e\": container with ID starting with 9bcaf510219165dbbff0561363074075213557f9315c0786aed45b26b2d4e15e not found: ID does not exist" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.626977 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.627499 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.627607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.627688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.627823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.627933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.628055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.628482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8cq\" (UniqueName: \"kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.729754 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.730446 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.730841 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8cq\" (UniqueName: \"kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.730938 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.731043 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.731158 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.731225 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.731323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.730199 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.741041 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.730807 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.749529 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.753362 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.776668 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.791973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.793782 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8cq\" (UniqueName: \"kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.808121 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.848117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.862042 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:34 crc kubenswrapper[4830]: I0318 18:22:34.939058 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.035429 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vfp4\" (UniqueName: \"kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4\") pod \"0c12321e-7436-4126-9ad3-597fa7216bc8\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.035636 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts\") pod \"0c12321e-7436-4126-9ad3-597fa7216bc8\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.035703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data\") pod \"0c12321e-7436-4126-9ad3-597fa7216bc8\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.035785 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle\") pod \"0c12321e-7436-4126-9ad3-597fa7216bc8\" (UID: \"0c12321e-7436-4126-9ad3-597fa7216bc8\") " Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.041293 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts" (OuterVolumeSpecName: "scripts") pod "0c12321e-7436-4126-9ad3-597fa7216bc8" (UID: "0c12321e-7436-4126-9ad3-597fa7216bc8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.041604 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4" (OuterVolumeSpecName: "kube-api-access-9vfp4") pod "0c12321e-7436-4126-9ad3-597fa7216bc8" (UID: "0c12321e-7436-4126-9ad3-597fa7216bc8"). InnerVolumeSpecName "kube-api-access-9vfp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.096553 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c12321e-7436-4126-9ad3-597fa7216bc8" (UID: "0c12321e-7436-4126-9ad3-597fa7216bc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.104899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data" (OuterVolumeSpecName: "config-data") pod "0c12321e-7436-4126-9ad3-597fa7216bc8" (UID: "0c12321e-7436-4126-9ad3-597fa7216bc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.137538 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vfp4\" (UniqueName: \"kubernetes.io/projected/0c12321e-7436-4126-9ad3-597fa7216bc8-kube-api-access-9vfp4\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.137916 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.137929 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.137938 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c12321e-7436-4126-9ad3-597fa7216bc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.467579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerStarted","Data":"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"} Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.467668 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerStarted","Data":"5c035e7c801dc707638d118c501fd37160cfa0933be9e2ca3419a5991ba53d28"} Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.473931 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerStarted","Data":"a6a7b889d97bafd13659fbd280beab2f2e9328ce830ad8c489d224c19d4ad7f2"} Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.478148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" event={"ID":"0c12321e-7436-4126-9ad3-597fa7216bc8","Type":"ContainerDied","Data":"741fec637dedd1b67c3a0dea78ad608663d4bbcab737663721475d70f8ee784c"} Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.478190 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="741fec637dedd1b67c3a0dea78ad608663d4bbcab737663721475d70f8ee784c" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.478199 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbk9z" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.492077 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.514207 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:22:35 crc kubenswrapper[4830]: E0318 18:22:35.514951 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c12321e-7436-4126-9ad3-597fa7216bc8" containerName="nova-cell0-conductor-db-sync" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.514968 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c12321e-7436-4126-9ad3-597fa7216bc8" containerName="nova-cell0-conductor-db-sync" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.515288 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c12321e-7436-4126-9ad3-597fa7216bc8" containerName="nova-cell0-conductor-db-sync" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.516312 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.521148 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.521419 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v7lpj" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.526350 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.645496 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.646059 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7ndk\" (UniqueName: \"kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.646139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.749681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.750120 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7ndk\" (UniqueName: \"kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.750216 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.755538 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.764367 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.768023 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7ndk\" (UniqueName: \"kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk\") pod \"nova-cell0-conductor-0\" 
(UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:35 crc kubenswrapper[4830]: I0318 18:22:35.879795 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.259752 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee116fa-07c8-44cf-b7b9-8dd248a32d82" path="/var/lib/kubelet/pods/dee116fa-07c8-44cf-b7b9-8dd248a32d82/volumes" Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.345852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.493036 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerStarted","Data":"ae17ba4052b5b73e7f8747e0bbd64f898ebbc5356b7377e5822b10903adec77d"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.493075 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerStarted","Data":"82adec69ccede2e466b158cd9eeeee18db05fec83283556fdc16d31adf5888b0"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.495057 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0e71339-fd75-44b3-bbb8-15d75455d90f","Type":"ContainerStarted","Data":"fd2d24062fbfa06740b59dda44258239efc1b473de2f340be4737096134f19b3"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.500372 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerStarted","Data":"13a949ebe12567f356b288e72620234deec79f64d460b08c050f70b6131858f4"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.500426 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerStarted","Data":"0746b8bb66f5e7517a5d7f696d7212e472acad426b45aa47e1826fd52a0611e5"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.505464 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerStarted","Data":"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890"} Mar 18 18:22:36 crc kubenswrapper[4830]: I0318 18:22:36.545629 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.545598052 podStartE2EDuration="3.545598052s" podCreationTimestamp="2026-03-18 18:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:36.526941448 +0000 UTC m=+1191.094571790" watchObservedRunningTime="2026-03-18 18:22:36.545598052 +0000 UTC m=+1191.113228384" Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.518606 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerStarted","Data":"41f23f0d4fef2bb42d4c0645e34a4042e362df833aa1814c1dd80e578b447069"} Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.522693 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0e71339-fd75-44b3-bbb8-15d75455d90f","Type":"ContainerStarted","Data":"b5f8b7f66219fddf66e22ef6b5a06dba84482b8f68cbbeea50a396ebe1d339d0"} Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.523376 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.526381 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerStarted","Data":"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec"} Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.558517 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.558490709 podStartE2EDuration="3.558490709s" podCreationTimestamp="2026-03-18 18:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:37.552352296 +0000 UTC m=+1192.119982658" watchObservedRunningTime="2026-03-18 18:22:37.558490709 +0000 UTC m=+1192.126121051" Mar 18 18:22:37 crc kubenswrapper[4830]: I0318 18:22:37.590923 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5909072699999998 podStartE2EDuration="2.59090727s" podCreationTimestamp="2026-03-18 18:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:37.581397363 +0000 UTC m=+1192.149027695" watchObservedRunningTime="2026-03-18 18:22:37.59090727 +0000 UTC m=+1192.158537602" Mar 18 18:22:39 crc kubenswrapper[4830]: I0318 18:22:39.568034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerStarted","Data":"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da"} Mar 18 18:22:39 crc kubenswrapper[4830]: I0318 18:22:39.569930 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:22:39 crc kubenswrapper[4830]: I0318 18:22:39.609501 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.711375158 podStartE2EDuration="6.60948203s" podCreationTimestamp="2026-03-18 18:22:33 +0000 UTC" firstStartedPulling="2026-03-18 18:22:34.406440152 +0000 UTC m=+1188.974070474" lastFinishedPulling="2026-03-18 18:22:38.304546974 +0000 UTC m=+1192.872177346" observedRunningTime="2026-03-18 18:22:39.599031136 +0000 UTC m=+1194.166661488" watchObservedRunningTime="2026-03-18 18:22:39.60948203 +0000 UTC m=+1194.177112362" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.080116 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.080682 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.123424 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.127839 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.624670 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.624708 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.862952 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.863000 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.908500 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:44 crc kubenswrapper[4830]: I0318 18:22:44.925021 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:45 crc kubenswrapper[4830]: I0318 18:22:45.636344 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:45 crc kubenswrapper[4830]: I0318 18:22:45.637041 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:45 crc kubenswrapper[4830]: I0318 18:22:45.909083 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.425180 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fl66n"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.426742 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.429139 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.429274 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.442126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.442859 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fl66n"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.519016 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.526660 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.526721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.527017 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mvb\" (UniqueName: 
\"kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.527054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.630756 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.636251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mvb\" (UniqueName: \"kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.636318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.636427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.650582 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.651985 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.652451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.652487 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.655828 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.657811 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.666481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mvb\" (UniqueName: \"kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb\") pod \"nova-cell0-cell-mapping-fl66n\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.687859 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.722937 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.724976 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.731908 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.742324 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.742658 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.742756 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzpp\" (UniqueName: \"kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.742823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.769081 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.782063 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.856539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863126 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9c48\" (UniqueName: \"kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzpp\" (UniqueName: \"kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863510 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.863826 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.867676 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.878813 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.887533 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.889763 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.891427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.896011 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzpp\" (UniqueName: \"kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp\") pod \"nova-api-0\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " pod="openstack/nova-api-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.900213 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.950389 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9c48\" (UniqueName: \"kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969690 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969730 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969753 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969793 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.969815 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7mm\" (UniqueName: \"kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.993850 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data\") pod \"nova-scheduler-0\" (UID: 
\"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.998443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.998664 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"] Mar 18 18:22:46 crc kubenswrapper[4830]: I0318 18:22:46.999139 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9c48\" (UniqueName: \"kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48\") pod \"nova-scheduler-0\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " pod="openstack/nova-scheduler-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.000256 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.021213 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072184 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072224 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072254 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072276 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh7mm\" (UniqueName: \"kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072297 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8zr\" (UniqueName: 
\"kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072378 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.072454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.073038 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.079962 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.082875 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.082904 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.083947 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.084387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.087182 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.091022 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.107046 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7mm\" (UniqueName: \"kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm\") pod \"nova-metadata-0\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.137050 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.176869 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.176924 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.176968 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177013 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8zr\" (UniqueName: \"kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177034 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177068 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177096 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46bpj\" (UniqueName: \"kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177119 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.177137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.178432 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.178465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.179574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.181477 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " 
pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.182331 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.206624 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8zr\" (UniqueName: \"kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr\") pod \"dnsmasq-dns-7b495b9cc7-kpxvt\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") " pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.279086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bpj\" (UniqueName: \"kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.279135 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.279308 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 
18:22:47.283474 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.285142 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.299863 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bpj\" (UniqueName: \"kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.312273 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.323491 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.410340 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.545397 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fl66n"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.675361 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.691188 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p8g2m"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.692394 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.697115 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.697235 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 18:22:47 crc kubenswrapper[4830]: W0318 18:22:47.714905 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7035cb51_8209_41dc_8a05_7faecc3c0985.slice/crio-79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213 WatchSource:0}: Error finding container 79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213: Status 404 returned error can't find the container with id 79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213 Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.719412 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p8g2m"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.739685 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fl66n" 
event={"ID":"e0a030d7-4344-449d-8edb-805be7b5604f","Type":"ContainerStarted","Data":"e0191c03f27883fa7746c5f2179aefb2dd877fcf7fff8c51acdb614f9dca27f5"} Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.764432 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.787367 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.787739 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd2xk\" (UniqueName: \"kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.787850 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.787922 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc 
kubenswrapper[4830]: I0318 18:22:47.889623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.889731 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd2xk\" (UniqueName: \"kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.889756 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.889799 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.895319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc 
kubenswrapper[4830]: I0318 18:22:47.895764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.906402 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd2xk\" (UniqueName: \"kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.916823 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts\") pod \"nova-cell1-conductor-db-sync-p8g2m\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:47 crc kubenswrapper[4830]: I0318 18:22:47.959620 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"] Mar 18 18:22:47 crc kubenswrapper[4830]: W0318 18:22:47.960550 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25520247_79bc_4a87_abaf_57cd2e711a99.slice/crio-26f6d2b50b3ead96107aae571ab80074596a2330376ba03711041cab6812c723 WatchSource:0}: Error finding container 26f6d2b50b3ead96107aae571ab80074596a2330376ba03711041cab6812c723: Status 404 returned error can't find the container with id 26f6d2b50b3ead96107aae571ab80074596a2330376ba03711041cab6812c723 Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.040898 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:22:48 crc kubenswrapper[4830]: W0318 18:22:48.047791 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0470b0d_c2a4_4891_9a44_6142f0fe01d8.slice/crio-0f629936f68e3dd69370f90700f39bed7b9f25a246f4ab9a140d916612a65eaf WatchSource:0}: Error finding container 0f629936f68e3dd69370f90700f39bed7b9f25a246f4ab9a140d916612a65eaf: Status 404 returned error can't find the container with id 0f629936f68e3dd69370f90700f39bed7b9f25a246f4ab9a140d916612a65eaf Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.053312 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.055093 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.055127 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.129949 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.613491 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p8g2m"] Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.769409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerStarted","Data":"0f629936f68e3dd69370f90700f39bed7b9f25a246f4ab9a140d916612a65eaf"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.771102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72","Type":"ContainerStarted","Data":"7ad7bb7f875f34038111604f86d39bbaf0805565a495b4d7eae5d4d9c2b94d87"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.775026 4830 generic.go:334] "Generic (PLEG): container finished" podID="25520247-79bc-4a87-abaf-57cd2e711a99" containerID="b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208" exitCode=0 Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.775227 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" event={"ID":"25520247-79bc-4a87-abaf-57cd2e711a99","Type":"ContainerDied","Data":"b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.775314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" event={"ID":"25520247-79bc-4a87-abaf-57cd2e711a99","Type":"ContainerStarted","Data":"26f6d2b50b3ead96107aae571ab80074596a2330376ba03711041cab6812c723"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.781104 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" 
event={"ID":"381d2049-85b1-49f5-a548-c3f5449fee4d","Type":"ContainerStarted","Data":"3e219f2eca3ba65bb3c075802abaabd515d557179705c102027947c6e29b7ca2"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.785972 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5","Type":"ContainerStarted","Data":"ceb527ac84080dd62d9eb820bdeb1b575f889cffdd1f681b2083b2261594ed8f"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.797268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fl66n" event={"ID":"e0a030d7-4344-449d-8edb-805be7b5604f","Type":"ContainerStarted","Data":"0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.804156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerStarted","Data":"79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213"} Mar 18 18:22:48 crc kubenswrapper[4830]: I0318 18:22:48.817019 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fl66n" podStartSLOduration=2.817002012 podStartE2EDuration="2.817002012s" podCreationTimestamp="2026-03-18 18:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:48.815753927 +0000 UTC m=+1203.383384259" watchObservedRunningTime="2026-03-18 18:22:48.817002012 +0000 UTC m=+1203.384632344" Mar 18 18:22:49 crc kubenswrapper[4830]: I0318 18:22:49.828886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" event={"ID":"25520247-79bc-4a87-abaf-57cd2e711a99","Type":"ContainerStarted","Data":"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"} Mar 18 18:22:49 crc 
kubenswrapper[4830]: I0318 18:22:49.830698 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:49 crc kubenswrapper[4830]: I0318 18:22:49.843850 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" event={"ID":"381d2049-85b1-49f5-a548-c3f5449fee4d","Type":"ContainerStarted","Data":"21a4f27449c9ea1b7a1149cfd63216765da9dd6af1f60659938e91d068c9e30d"} Mar 18 18:22:49 crc kubenswrapper[4830]: I0318 18:22:49.860207 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" podStartSLOduration=3.86019036 podStartE2EDuration="3.86019036s" podCreationTimestamp="2026-03-18 18:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:49.859653515 +0000 UTC m=+1204.427283847" watchObservedRunningTime="2026-03-18 18:22:49.86019036 +0000 UTC m=+1204.427820692" Mar 18 18:22:49 crc kubenswrapper[4830]: I0318 18:22:49.889490 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" podStartSLOduration=2.889465353 podStartE2EDuration="2.889465353s" podCreationTimestamp="2026-03-18 18:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:49.876133508 +0000 UTC m=+1204.443763830" watchObservedRunningTime="2026-03-18 18:22:49.889465353 +0000 UTC m=+1204.457095685" Mar 18 18:22:50 crc kubenswrapper[4830]: I0318 18:22:50.516971 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:50 crc kubenswrapper[4830]: I0318 18:22:50.539443 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.863845 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerStarted","Data":"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.864151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerStarted","Data":"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.864066 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-metadata" containerID="cri-o://1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" gracePeriod=30 Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.864015 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-log" containerID="cri-o://4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" gracePeriod=30 Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.870623 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72","Type":"ContainerStarted","Data":"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.876044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5","Type":"ContainerStarted","Data":"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.876133 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4" gracePeriod=30 Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.881490 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerStarted","Data":"b60c175c22ac7e53fe71b2e37caf616851ec9cee94f547bc491df94aa3933b54"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.881542 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerStarted","Data":"701ea6d1b0fdb51a72c68cdb182a16279c852e27d1fa57098972a41a762217b5"} Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.898546 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.94890794 podStartE2EDuration="5.898524126s" podCreationTimestamp="2026-03-18 18:22:46 +0000 UTC" firstStartedPulling="2026-03-18 18:22:48.054728271 +0000 UTC m=+1202.622358603" lastFinishedPulling="2026-03-18 18:22:51.004344457 +0000 UTC m=+1205.571974789" observedRunningTime="2026-03-18 18:22:51.884589775 +0000 UTC m=+1206.452220097" watchObservedRunningTime="2026-03-18 18:22:51.898524126 +0000 UTC m=+1206.466154448" Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.908550 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.679451055 podStartE2EDuration="5.908530288s" podCreationTimestamp="2026-03-18 18:22:46 +0000 UTC" firstStartedPulling="2026-03-18 18:22:47.729373234 +0000 UTC m=+1202.297003556" lastFinishedPulling="2026-03-18 18:22:50.958452447 +0000 UTC m=+1205.526082789" observedRunningTime="2026-03-18 18:22:51.900464431 +0000 UTC m=+1206.468094763" 
watchObservedRunningTime="2026-03-18 18:22:51.908530288 +0000 UTC m=+1206.476160620" Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.934674 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.736443548 podStartE2EDuration="5.934646142s" podCreationTimestamp="2026-03-18 18:22:46 +0000 UTC" firstStartedPulling="2026-03-18 18:22:47.767056554 +0000 UTC m=+1202.334686876" lastFinishedPulling="2026-03-18 18:22:50.965259138 +0000 UTC m=+1205.532889470" observedRunningTime="2026-03-18 18:22:51.929678732 +0000 UTC m=+1206.497309064" watchObservedRunningTime="2026-03-18 18:22:51.934646142 +0000 UTC m=+1206.502276474" Mar 18 18:22:51 crc kubenswrapper[4830]: I0318 18:22:51.957868 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9804251370000001 podStartE2EDuration="4.957847464s" podCreationTimestamp="2026-03-18 18:22:47 +0000 UTC" firstStartedPulling="2026-03-18 18:22:48.043712822 +0000 UTC m=+1202.611343154" lastFinishedPulling="2026-03-18 18:22:51.021135149 +0000 UTC m=+1205.588765481" observedRunningTime="2026-03-18 18:22:51.952166424 +0000 UTC m=+1206.519796766" watchObservedRunningTime="2026-03-18 18:22:51.957847464 +0000 UTC m=+1206.525477786" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.137598 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.410576 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.641709 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.728476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh7mm\" (UniqueName: \"kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm\") pod \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.728562 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle\") pod \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.728608 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data\") pod \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.728833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs\") pod \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\" (UID: \"c0470b0d-c2a4-4891-9a44-6142f0fe01d8\") " Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.729138 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs" (OuterVolumeSpecName: "logs") pod "c0470b0d-c2a4-4891-9a44-6142f0fe01d8" (UID: "c0470b0d-c2a4-4891-9a44-6142f0fe01d8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.729449 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.734851 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm" (OuterVolumeSpecName: "kube-api-access-mh7mm") pod "c0470b0d-c2a4-4891-9a44-6142f0fe01d8" (UID: "c0470b0d-c2a4-4891-9a44-6142f0fe01d8"). InnerVolumeSpecName "kube-api-access-mh7mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.759701 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0470b0d-c2a4-4891-9a44-6142f0fe01d8" (UID: "c0470b0d-c2a4-4891-9a44-6142f0fe01d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.761238 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data" (OuterVolumeSpecName: "config-data") pod "c0470b0d-c2a4-4891-9a44-6142f0fe01d8" (UID: "c0470b0d-c2a4-4891-9a44-6142f0fe01d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.831404 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh7mm\" (UniqueName: \"kubernetes.io/projected/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-kube-api-access-mh7mm\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.831434 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.831442 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0470b0d-c2a4-4891-9a44-6142f0fe01d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.894989 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.895078 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerDied","Data":"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6"} Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.895208 4830 scope.go:117] "RemoveContainer" containerID="1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.894921 4830 generic.go:334] "Generic (PLEG): container finished" podID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerID="1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" exitCode=0 Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.901101 4830 generic.go:334] "Generic (PLEG): container finished" podID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" 
containerID="4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" exitCode=143 Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.901462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerDied","Data":"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf"} Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.901520 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0470b0d-c2a4-4891-9a44-6142f0fe01d8","Type":"ContainerDied","Data":"0f629936f68e3dd69370f90700f39bed7b9f25a246f4ab9a140d916612a65eaf"} Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.942365 4830 scope.go:117] "RemoveContainer" containerID="4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.951147 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.965763 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.979370 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:52 crc kubenswrapper[4830]: E0318 18:22:52.980009 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-log" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.980050 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-log" Mar 18 18:22:52 crc kubenswrapper[4830]: E0318 18:22:52.980072 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-metadata" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.980078 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-metadata" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.980327 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-metadata" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.980348 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" containerName="nova-metadata-log" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.981525 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.986001 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:22:52 crc kubenswrapper[4830]: I0318 18:22:52.986148 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.011715 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.015353 4830 scope.go:117] "RemoveContainer" containerID="1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" Mar 18 18:22:53 crc kubenswrapper[4830]: E0318 18:22:53.018082 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6\": container with ID starting with 1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6 not found: ID does not exist" containerID="1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018114 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6"} err="failed to get container status \"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6\": rpc error: code = NotFound desc = could not find container \"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6\": container with ID starting with 1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6 not found: ID does not exist" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018136 4830 scope.go:117] "RemoveContainer" containerID="4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" Mar 18 18:22:53 crc kubenswrapper[4830]: E0318 18:22:53.018451 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf\": container with ID starting with 4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf not found: ID does not exist" containerID="4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018480 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf"} err="failed to get container status \"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf\": rpc error: code = NotFound desc = could not find container \"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf\": container with ID starting with 4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf not found: ID does not exist" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018500 4830 scope.go:117] "RemoveContainer" containerID="1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018971 4830 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6"} err="failed to get container status \"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6\": rpc error: code = NotFound desc = could not find container \"1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6\": container with ID starting with 1debd17fa7673f524a8cf265f2aa8afca0fdf2579c6e2cf9636ac62c5487eff6 not found: ID does not exist" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.018989 4830 scope.go:117] "RemoveContainer" containerID="4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.019175 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf"} err="failed to get container status \"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf\": rpc error: code = NotFound desc = could not find container \"4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf\": container with ID starting with 4d9b64c32139e08fb3367328317250b7882e54837663f56af183d26ef7741dcf not found: ID does not exist" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.039354 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.039393 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " 
pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.039419 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbmn\" (UniqueName: \"kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.039502 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.039595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.141710 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.142092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.142237 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.142309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.142404 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbmn\" (UniqueName: \"kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.142527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.146524 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.147015 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data\") pod \"nova-metadata-0\" (UID: 
\"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.147212 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.157439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbmn\" (UniqueName: \"kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn\") pod \"nova-metadata-0\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.355946 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.859309 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:53 crc kubenswrapper[4830]: W0318 18:22:53.867086 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda92d8ef1_bc89_48a8_b089_51c749d2d3d4.slice/crio-91107b2f99557f1e5b77bbd55df92bb8bcc9df5e160c5e35c95847bb4df3e332 WatchSource:0}: Error finding container 91107b2f99557f1e5b77bbd55df92bb8bcc9df5e160c5e35c95847bb4df3e332: Status 404 returned error can't find the container with id 91107b2f99557f1e5b77bbd55df92bb8bcc9df5e160c5e35c95847bb4df3e332 Mar 18 18:22:53 crc kubenswrapper[4830]: I0318 18:22:53.912614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerStarted","Data":"91107b2f99557f1e5b77bbd55df92bb8bcc9df5e160c5e35c95847bb4df3e332"} Mar 18 18:22:54 crc 
kubenswrapper[4830]: I0318 18:22:54.248332 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0470b0d-c2a4-4891-9a44-6142f0fe01d8" path="/var/lib/kubelet/pods/c0470b0d-c2a4-4891-9a44-6142f0fe01d8/volumes" Mar 18 18:22:54 crc kubenswrapper[4830]: I0318 18:22:54.929489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerStarted","Data":"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6"} Mar 18 18:22:54 crc kubenswrapper[4830]: I0318 18:22:54.931431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerStarted","Data":"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7"} Mar 18 18:22:55 crc kubenswrapper[4830]: E0318 18:22:55.485756 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a030d7_4344_449d_8edb_805be7b5604f.slice/crio-conmon-0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a030d7_4344_449d_8edb_805be7b5604f.slice/crio-0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:22:55 crc kubenswrapper[4830]: I0318 18:22:55.943247 4830 generic.go:334] "Generic (PLEG): container finished" podID="e0a030d7-4344-449d-8edb-805be7b5604f" containerID="0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4" exitCode=0 Mar 18 18:22:55 crc kubenswrapper[4830]: I0318 18:22:55.943344 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fl66n" 
event={"ID":"e0a030d7-4344-449d-8edb-805be7b5604f","Type":"ContainerDied","Data":"0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4"} Mar 18 18:22:55 crc kubenswrapper[4830]: I0318 18:22:55.961853 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.961832423 podStartE2EDuration="3.961832423s" podCreationTimestamp="2026-03-18 18:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:54.959250726 +0000 UTC m=+1209.526881078" watchObservedRunningTime="2026-03-18 18:22:55.961832423 +0000 UTC m=+1210.529462765" Mar 18 18:22:56 crc kubenswrapper[4830]: I0318 18:22:56.960150 4830 generic.go:334] "Generic (PLEG): container finished" podID="381d2049-85b1-49f5-a548-c3f5449fee4d" containerID="21a4f27449c9ea1b7a1149cfd63216765da9dd6af1f60659938e91d068c9e30d" exitCode=0 Mar 18 18:22:56 crc kubenswrapper[4830]: I0318 18:22:56.960218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" event={"ID":"381d2049-85b1-49f5-a548-c3f5449fee4d","Type":"ContainerDied","Data":"21a4f27449c9ea1b7a1149cfd63216765da9dd6af1f60659938e91d068c9e30d"} Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.081390 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.081449 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.138442 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.219170 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 18:22:57 crc kubenswrapper[4830]: 
I0318 18:22:57.325146 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.425312 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"] Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.426448 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="dnsmasq-dns" containerID="cri-o://17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654" gracePeriod=10 Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.433946 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.558488 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts\") pod \"e0a030d7-4344-449d-8edb-805be7b5604f\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.558622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6mvb\" (UniqueName: \"kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb\") pod \"e0a030d7-4344-449d-8edb-805be7b5604f\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.558765 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle\") pod \"e0a030d7-4344-449d-8edb-805be7b5604f\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.558813 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data\") pod \"e0a030d7-4344-449d-8edb-805be7b5604f\" (UID: \"e0a030d7-4344-449d-8edb-805be7b5604f\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.567188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts" (OuterVolumeSpecName: "scripts") pod "e0a030d7-4344-449d-8edb-805be7b5604f" (UID: "e0a030d7-4344-449d-8edb-805be7b5604f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.568161 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb" (OuterVolumeSpecName: "kube-api-access-c6mvb") pod "e0a030d7-4344-449d-8edb-805be7b5604f" (UID: "e0a030d7-4344-449d-8edb-805be7b5604f"). InnerVolumeSpecName "kube-api-access-c6mvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.602067 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data" (OuterVolumeSpecName: "config-data") pod "e0a030d7-4344-449d-8edb-805be7b5604f" (UID: "e0a030d7-4344-449d-8edb-805be7b5604f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.605954 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a030d7-4344-449d-8edb-805be7b5604f" (UID: "e0a030d7-4344-449d-8edb-805be7b5604f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.661054 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.661084 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6mvb\" (UniqueName: \"kubernetes.io/projected/e0a030d7-4344-449d-8edb-805be7b5604f-kube-api-access-c6mvb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.661097 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.661106 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a030d7-4344-449d-8edb-805be7b5604f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.870435 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966293 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966354 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966456 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m7qp\" (UniqueName: \"kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966503 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.966527 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config\") pod \"d92164cb-de18-4223-9379-203b3e0cb28b\" (UID: \"d92164cb-de18-4223-9379-203b3e0cb28b\") " Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.973609 4830 generic.go:334] "Generic (PLEG): container finished" podID="d92164cb-de18-4223-9379-203b3e0cb28b" containerID="17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654" exitCode=0 Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.973665 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" event={"ID":"d92164cb-de18-4223-9379-203b3e0cb28b","Type":"ContainerDied","Data":"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654"} Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.973693 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" event={"ID":"d92164cb-de18-4223-9379-203b3e0cb28b","Type":"ContainerDied","Data":"44dc4f57d6d0146271c47a40e81d34f33506e97a5c4eef49b45308a63a9bd0e3"} Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.973708 4830 scope.go:117] "RemoveContainer" containerID="17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.973838 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-qhlkl" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.974081 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp" (OuterVolumeSpecName: "kube-api-access-9m7qp") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "kube-api-access-9m7qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.979868 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fl66n" Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.980717 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fl66n" event={"ID":"e0a030d7-4344-449d-8edb-805be7b5604f","Type":"ContainerDied","Data":"e0191c03f27883fa7746c5f2179aefb2dd877fcf7fff8c51acdb614f9dca27f5"} Mar 18 18:22:57 crc kubenswrapper[4830]: I0318 18:22:57.980812 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0191c03f27883fa7746c5f2179aefb2dd877fcf7fff8c51acdb614f9dca27f5" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.034426 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.035662 4830 scope.go:117] "RemoveContainer" containerID="122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.039148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.058730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.066918 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.071376 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.084834 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m7qp\" (UniqueName: \"kubernetes.io/projected/d92164cb-de18-4223-9379-203b3e0cb28b-kube-api-access-9m7qp\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.084978 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.085064 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.084079 4830 scope.go:117] "RemoveContainer" containerID="17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654" Mar 18 18:22:58 crc kubenswrapper[4830]: E0318 18:22:58.085636 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654\": container with ID starting with 17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654 not found: ID does not exist" containerID="17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.085674 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654"} err="failed to get container status \"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654\": rpc error: code = NotFound desc = could not find container \"17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654\": container with ID starting with 17b3766b09800ce3b7c53d51148704ff79130830e4c4453df9d069a9ae638654 not found: ID does not exist" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.085695 4830 scope.go:117] "RemoveContainer" containerID="122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88" Mar 18 18:22:58 crc kubenswrapper[4830]: E0318 18:22:58.085921 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88\": container with ID starting with 122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88 not found: ID does not exist" containerID="122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.085937 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88"} err="failed to get container status \"122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88\": rpc error: code = NotFound desc = could not find container \"122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88\": container with ID 
starting with 122fa7e3ad46f72955996c1f7fccf4cf69a1c2cf68c1bb4bba2014756023ef88 not found: ID does not exist" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.090710 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config" (OuterVolumeSpecName: "config") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.104067 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d92164cb-de18-4223-9379-203b3e0cb28b" (UID: "d92164cb-de18-4223-9379-203b3e0cb28b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.118610 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.119083 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-log" containerID="cri-o://701ea6d1b0fdb51a72c68cdb182a16279c852e27d1fa57098972a41a762217b5" gracePeriod=30 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.119289 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-api" containerID="cri-o://b60c175c22ac7e53fe71b2e37caf616851ec9cee94f547bc491df94aa3933b54" gracePeriod=30 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.126090 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-log" 
probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.126207 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.162666 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.175650 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.175957 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-log" containerID="cri-o://5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" gracePeriod=30 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.176492 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-metadata" containerID="cri-o://5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" gracePeriod=30 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.187102 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.187127 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92164cb-de18-4223-9379-203b3e0cb28b-config\") on node 
\"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.303799 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"] Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.310222 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.311474 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-qhlkl"] Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.390247 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data\") pod \"381d2049-85b1-49f5-a548-c3f5449fee4d\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.390376 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd2xk\" (UniqueName: \"kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk\") pod \"381d2049-85b1-49f5-a548-c3f5449fee4d\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.390470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts\") pod \"381d2049-85b1-49f5-a548-c3f5449fee4d\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.390504 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle\") pod \"381d2049-85b1-49f5-a548-c3f5449fee4d\" (UID: \"381d2049-85b1-49f5-a548-c3f5449fee4d\") " Mar 18 18:22:58 crc kubenswrapper[4830]: 
I0318 18:22:58.397279 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts" (OuterVolumeSpecName: "scripts") pod "381d2049-85b1-49f5-a548-c3f5449fee4d" (UID: "381d2049-85b1-49f5-a548-c3f5449fee4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.398261 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk" (OuterVolumeSpecName: "kube-api-access-rd2xk") pod "381d2049-85b1-49f5-a548-c3f5449fee4d" (UID: "381d2049-85b1-49f5-a548-c3f5449fee4d"). InnerVolumeSpecName "kube-api-access-rd2xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.429160 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "381d2049-85b1-49f5-a548-c3f5449fee4d" (UID: "381d2049-85b1-49f5-a548-c3f5449fee4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.431389 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data" (OuterVolumeSpecName: "config-data") pod "381d2049-85b1-49f5-a548-c3f5449fee4d" (UID: "381d2049-85b1-49f5-a548-c3f5449fee4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.492793 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.492833 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.492847 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd2xk\" (UniqueName: \"kubernetes.io/projected/381d2049-85b1-49f5-a548-c3f5449fee4d-kube-api-access-rd2xk\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.492860 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381d2049-85b1-49f5-a548-c3f5449fee4d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.789834 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.900738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs\") pod \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.900815 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs\") pod \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.900881 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data\") pod \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.900969 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrbmn\" (UniqueName: \"kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn\") pod \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.901024 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle\") pod \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\" (UID: \"a92d8ef1-bc89-48a8-b089-51c749d2d3d4\") " Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.901196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs" (OuterVolumeSpecName: "logs") pod "a92d8ef1-bc89-48a8-b089-51c749d2d3d4" (UID: "a92d8ef1-bc89-48a8-b089-51c749d2d3d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.901658 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.904430 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn" (OuterVolumeSpecName: "kube-api-access-lrbmn") pod "a92d8ef1-bc89-48a8-b089-51c749d2d3d4" (UID: "a92d8ef1-bc89-48a8-b089-51c749d2d3d4"). InnerVolumeSpecName "kube-api-access-lrbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.925724 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data" (OuterVolumeSpecName: "config-data") pod "a92d8ef1-bc89-48a8-b089-51c749d2d3d4" (UID: "a92d8ef1-bc89-48a8-b089-51c749d2d3d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.926250 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a92d8ef1-bc89-48a8-b089-51c749d2d3d4" (UID: "a92d8ef1-bc89-48a8-b089-51c749d2d3d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.954146 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a92d8ef1-bc89-48a8-b089-51c749d2d3d4" (UID: "a92d8ef1-bc89-48a8-b089-51c749d2d3d4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989482 4830 generic.go:334] "Generic (PLEG): container finished" podID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerID="5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" exitCode=0 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989515 4830 generic.go:334] "Generic (PLEG): container finished" podID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerID="5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" exitCode=143 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989536 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989564 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerDied","Data":"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6"} Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerDied","Data":"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7"} Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a92d8ef1-bc89-48a8-b089-51c749d2d3d4","Type":"ContainerDied","Data":"91107b2f99557f1e5b77bbd55df92bb8bcc9df5e160c5e35c95847bb4df3e332"} Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.989618 4830 scope.go:117] "RemoveContainer" containerID="5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.995806 4830 generic.go:334] "Generic (PLEG): container finished" podID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerID="701ea6d1b0fdb51a72c68cdb182a16279c852e27d1fa57098972a41a762217b5" exitCode=143 Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.995909 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerDied","Data":"701ea6d1b0fdb51a72c68cdb182a16279c852e27d1fa57098972a41a762217b5"} Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.997401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" 
event={"ID":"381d2049-85b1-49f5-a548-c3f5449fee4d","Type":"ContainerDied","Data":"3e219f2eca3ba65bb3c075802abaabd515d557179705c102027947c6e29b7ca2"} Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.997433 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e219f2eca3ba65bb3c075802abaabd515d557179705c102027947c6e29b7ca2" Mar 18 18:22:58 crc kubenswrapper[4830]: I0318 18:22:58.997497 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p8g2m" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.004060 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.004082 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.004092 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.004104 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrbmn\" (UniqueName: \"kubernetes.io/projected/a92d8ef1-bc89-48a8-b089-51c749d2d3d4-kube-api-access-lrbmn\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.020370 4830 scope.go:117] "RemoveContainer" containerID="5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.046471 4830 scope.go:117] "RemoveContainer" 
containerID="5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.051175 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6\": container with ID starting with 5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6 not found: ID does not exist" containerID="5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.051212 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6"} err="failed to get container status \"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6\": rpc error: code = NotFound desc = could not find container \"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6\": container with ID starting with 5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6 not found: ID does not exist" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.051245 4830 scope.go:117] "RemoveContainer" containerID="5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.051630 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7\": container with ID starting with 5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7 not found: ID does not exist" containerID="5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.051679 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7"} err="failed to get container status \"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7\": rpc error: code = NotFound desc = could not find container \"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7\": container with ID starting with 5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7 not found: ID does not exist" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.051707 4830 scope.go:117] "RemoveContainer" containerID="5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.054296 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6"} err="failed to get container status \"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6\": rpc error: code = NotFound desc = could not find container \"5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6\": container with ID starting with 5c32b7d95bf5e43f2cd95245b496163720c2cde70afecc1a4c3a7c253d34b4f6 not found: ID does not exist" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.054318 4830 scope.go:117] "RemoveContainer" containerID="5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.054749 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7"} err="failed to get container status \"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7\": rpc error: code = NotFound desc = could not find container \"5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7\": container with ID starting with 5a7150f4cc38a159b5207291090254ccef7582ace44b36cc2edd55784fa00da7 not found: ID does not 
exist" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.055423 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.065309 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073081 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073438 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="init" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073453 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="init" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073484 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a030d7-4344-449d-8edb-805be7b5604f" containerName="nova-manage" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073489 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a030d7-4344-449d-8edb-805be7b5604f" containerName="nova-manage" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073505 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-metadata" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073511 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-metadata" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073522 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-log" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073528 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-log" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073541 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="dnsmasq-dns" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073546 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="dnsmasq-dns" Mar 18 18:22:59 crc kubenswrapper[4830]: E0318 18:22:59.073556 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381d2049-85b1-49f5-a548-c3f5449fee4d" containerName="nova-cell1-conductor-db-sync" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073561 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="381d2049-85b1-49f5-a548-c3f5449fee4d" containerName="nova-cell1-conductor-db-sync" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073712 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" containerName="dnsmasq-dns" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073721 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a030d7-4344-449d-8edb-805be7b5604f" containerName="nova-manage" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073729 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-metadata" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073740 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" containerName="nova-metadata-log" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.073760 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="381d2049-85b1-49f5-a548-c3f5449fee4d" containerName="nova-cell1-conductor-db-sync" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.074319 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.075928 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.082898 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.084254 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.086185 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.086256 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.096846 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.115017 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.208833 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209202 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209252 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209272 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwh9\" (UniqueName: \"kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209381 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtklf\" (UniqueName: \"kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs\") pod \"nova-metadata-0\" (UID: 
\"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.209483 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311200 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311265 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311293 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 
18:22:59.311324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwh9\" (UniqueName: \"kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311374 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtklf\" (UniqueName: \"kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.311428 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.314308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.315998 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data\") pod 
\"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.317120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.317470 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.317764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.324450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.334423 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtklf\" (UniqueName: \"kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf\") pod \"nova-metadata-0\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.336306 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwh9\" (UniqueName: \"kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9\") pod \"nova-cell1-conductor-0\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.396709 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.422479 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.510177 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:22:59 crc kubenswrapper[4830]: I0318 18:22:59.510507 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.006646 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.016205 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.043380 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"44872ddd-52a8-4ca8-a07e-f84111475b8f","Type":"ContainerStarted","Data":"ac97e09895ec9d7d87458ffd0aeb3d5d9139d115aea8746f3981aee77198c5a4"} Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.043559 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerName="nova-scheduler-scheduler" containerID="cri-o://111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" gracePeriod=30 Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.245173 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92d8ef1-bc89-48a8-b089-51c749d2d3d4" path="/var/lib/kubelet/pods/a92d8ef1-bc89-48a8-b089-51c749d2d3d4/volumes" Mar 18 18:23:00 crc kubenswrapper[4830]: I0318 18:23:00.245871 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92164cb-de18-4223-9379-203b3e0cb28b" path="/var/lib/kubelet/pods/d92164cb-de18-4223-9379-203b3e0cb28b/volumes" Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.057598 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerStarted","Data":"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e"} Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.058603 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerStarted","Data":"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68"} Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.058623 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerStarted","Data":"4f964093b91b5f5c5ca440d51fb40c3661385f00ab648b3dcc670918940b3038"} Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.062001 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"44872ddd-52a8-4ca8-a07e-f84111475b8f","Type":"ContainerStarted","Data":"d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d"} Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.062188 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.083132 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.083105222 podStartE2EDuration="2.083105222s" podCreationTimestamp="2026-03-18 18:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:01.073255995 +0000 UTC m=+1215.640886397" watchObservedRunningTime="2026-03-18 18:23:01.083105222 +0000 UTC m=+1215.650735554" Mar 18 18:23:01 crc kubenswrapper[4830]: I0318 18:23:01.096806 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.096788337 podStartE2EDuration="2.096788337s" podCreationTimestamp="2026-03-18 18:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:01.091606451 +0000 UTC m=+1215.659236783" watchObservedRunningTime="2026-03-18 18:23:01.096788337 +0000 UTC m=+1215.664418669" Mar 18 18:23:02 crc kubenswrapper[4830]: E0318 18:23:02.140446 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:02 crc kubenswrapper[4830]: E0318 18:23:02.146160 4830 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:02 crc kubenswrapper[4830]: E0318 18:23:02.148409 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:02 crc kubenswrapper[4830]: E0318 18:23:02.148481 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerName="nova-scheduler-scheduler" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.013386 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.034172 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9c48\" (UniqueName: \"kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48\") pod \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.034320 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data\") pod \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.034498 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle\") pod \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\" (UID: \"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72\") " Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.049662 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48" (OuterVolumeSpecName: "kube-api-access-s9c48") pod "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" (UID: "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72"). InnerVolumeSpecName "kube-api-access-s9c48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.068201 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data" (OuterVolumeSpecName: "config-data") pod "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" (UID: "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.076226 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" (UID: "59e6ccc2-5823-4e3c-b3b6-cd18920a3e72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.100977 4830 generic.go:334] "Generic (PLEG): container finished" podID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" exitCode=0 Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.101036 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72","Type":"ContainerDied","Data":"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7"} Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.101067 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e6ccc2-5823-4e3c-b3b6-cd18920a3e72","Type":"ContainerDied","Data":"7ad7bb7f875f34038111604f86d39bbaf0805565a495b4d7eae5d4d9c2b94d87"} Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.101101 4830 scope.go:117] "RemoveContainer" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.101297 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.137308 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9c48\" (UniqueName: \"kubernetes.io/projected/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-kube-api-access-s9c48\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.137359 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.137382 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.161455 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.169333 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.181881 4830 scope.go:117] "RemoveContainer" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" Mar 18 18:23:03 crc kubenswrapper[4830]: E0318 18:23:03.182496 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7\": container with ID starting with 111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7 not found: ID does not exist" containerID="111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.182529 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7"} err="failed to get container status \"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7\": rpc error: code = NotFound desc = could not find container \"111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7\": container with ID starting with 111db09911f989aafe8e57a7dcb456dafb974dda53468842a3118674e3a02ec7 not found: ID does not exist" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.195930 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:03 crc kubenswrapper[4830]: E0318 18:23:03.196372 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerName="nova-scheduler-scheduler" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.196395 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerName="nova-scheduler-scheduler" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.196604 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" containerName="nova-scheduler-scheduler" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.197281 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.201569 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.223137 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.238936 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmmn\" (UniqueName: \"kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.239083 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.239146 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.340396 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.340473 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.340540 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmmn\" (UniqueName: \"kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.344323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.344670 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.357231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmmn\" (UniqueName: \"kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn\") pod \"nova-scheduler-0\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.518828 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.899806 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 18:23:03 crc kubenswrapper[4830]: I0318 18:23:03.992553 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.113824 4830 generic.go:334] "Generic (PLEG): container finished" podID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerID="b60c175c22ac7e53fe71b2e37caf616851ec9cee94f547bc491df94aa3933b54" exitCode=0 Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.113922 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerDied","Data":"b60c175c22ac7e53fe71b2e37caf616851ec9cee94f547bc491df94aa3933b54"} Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.113967 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7035cb51-8209-41dc-8a05-7faecc3c0985","Type":"ContainerDied","Data":"79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213"} Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.113982 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79bf7cd01c6a7883a346f32f4b35ca0e66a968d7bfbb78988dcf53c4f0fd0213" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.116533 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff705200-15b1-471b-a5af-97566ce67516","Type":"ContainerStarted","Data":"0b3eb7284131a964ea82b013e3fd88b742c7bc0f6b9d679ac1eed4fd906617b3"} Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.155142 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.245114 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e6ccc2-5823-4e3c-b3b6-cd18920a3e72" path="/var/lib/kubelet/pods/59e6ccc2-5823-4e3c-b3b6-cd18920a3e72/volumes" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.357730 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwzpp\" (UniqueName: \"kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp\") pod \"7035cb51-8209-41dc-8a05-7faecc3c0985\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.358333 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs\") pod \"7035cb51-8209-41dc-8a05-7faecc3c0985\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.358494 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle\") pod \"7035cb51-8209-41dc-8a05-7faecc3c0985\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.358643 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data\") pod \"7035cb51-8209-41dc-8a05-7faecc3c0985\" (UID: \"7035cb51-8209-41dc-8a05-7faecc3c0985\") " Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.358705 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs" (OuterVolumeSpecName: "logs") pod "7035cb51-8209-41dc-8a05-7faecc3c0985" (UID: 
"7035cb51-8209-41dc-8a05-7faecc3c0985"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.363993 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp" (OuterVolumeSpecName: "kube-api-access-xwzpp") pod "7035cb51-8209-41dc-8a05-7faecc3c0985" (UID: "7035cb51-8209-41dc-8a05-7faecc3c0985"). InnerVolumeSpecName "kube-api-access-xwzpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.382151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data" (OuterVolumeSpecName: "config-data") pod "7035cb51-8209-41dc-8a05-7faecc3c0985" (UID: "7035cb51-8209-41dc-8a05-7faecc3c0985"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.383949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7035cb51-8209-41dc-8a05-7faecc3c0985" (UID: "7035cb51-8209-41dc-8a05-7faecc3c0985"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.460464 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.460503 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7035cb51-8209-41dc-8a05-7faecc3c0985-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.460515 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwzpp\" (UniqueName: \"kubernetes.io/projected/7035cb51-8209-41dc-8a05-7faecc3c0985-kube-api-access-xwzpp\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[4830]: I0318 18:23:04.460528 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7035cb51-8209-41dc-8a05-7faecc3c0985-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.131576 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.132832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff705200-15b1-471b-a5af-97566ce67516","Type":"ContainerStarted","Data":"8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83"} Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.176191 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.176164344 podStartE2EDuration="2.176164344s" podCreationTimestamp="2026-03-18 18:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:05.160193995 +0000 UTC m=+1219.727824337" watchObservedRunningTime="2026-03-18 18:23:05.176164344 +0000 UTC m=+1219.743794706" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.198563 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.217512 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.234147 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:05 crc kubenswrapper[4830]: E0318 18:23:05.234691 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-api" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.234714 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-api" Mar 18 18:23:05 crc kubenswrapper[4830]: E0318 18:23:05.234739 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-log" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 
18:23:05.234748 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-log" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.235021 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-api" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.235047 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" containerName="nova-api-log" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.237186 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.247411 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.249222 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.280128 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.280196 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.280278 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.280413 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg58n\" (UniqueName: \"kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.382723 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg58n\" (UniqueName: \"kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.383412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.383437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.384209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: 
I0318 18:23:05.384579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.389509 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.396604 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.403643 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg58n\" (UniqueName: \"kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n\") pod \"nova-api-0\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " pod="openstack/nova-api-0" Mar 18 18:23:05 crc kubenswrapper[4830]: I0318 18:23:05.562429 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:06 crc kubenswrapper[4830]: I0318 18:23:06.043576 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:06 crc kubenswrapper[4830]: W0318 18:23:06.053474 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6cc0c25_d795_4421_abbe_bfdcefb9db61.slice/crio-dc058af0be962eec06a8d1b36a9585d165a0148144c020b313f616f0c4949ca9 WatchSource:0}: Error finding container dc058af0be962eec06a8d1b36a9585d165a0148144c020b313f616f0c4949ca9: Status 404 returned error can't find the container with id dc058af0be962eec06a8d1b36a9585d165a0148144c020b313f616f0c4949ca9 Mar 18 18:23:06 crc kubenswrapper[4830]: I0318 18:23:06.146024 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerStarted","Data":"dc058af0be962eec06a8d1b36a9585d165a0148144c020b313f616f0c4949ca9"} Mar 18 18:23:06 crc kubenswrapper[4830]: I0318 18:23:06.245182 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7035cb51-8209-41dc-8a05-7faecc3c0985" path="/var/lib/kubelet/pods/7035cb51-8209-41dc-8a05-7faecc3c0985/volumes" Mar 18 18:23:07 crc kubenswrapper[4830]: I0318 18:23:07.154947 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerStarted","Data":"ae5d24f64c1c35cc8a1935c826501d58649d4660951145043ac21c04591ba75f"} Mar 18 18:23:07 crc kubenswrapper[4830]: I0318 18:23:07.155392 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerStarted","Data":"35816140a6f613f53804dd433976493f1668d481308131393e3b3babebfa4eb5"} Mar 18 18:23:07 crc kubenswrapper[4830]: I0318 18:23:07.229783 4830 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.229747989 podStartE2EDuration="2.229747989s" podCreationTimestamp="2026-03-18 18:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:07.2237361 +0000 UTC m=+1221.791366432" watchObservedRunningTime="2026-03-18 18:23:07.229747989 +0000 UTC m=+1221.797378321" Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.149026 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.149555 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="eceefe95-ad07-4228-ac93-a8f2484ba584" containerName="kube-state-metrics" containerID="cri-o://70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f" gracePeriod=30 Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.519821 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.653527 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.846408 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ndww\" (UniqueName: \"kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww\") pod \"eceefe95-ad07-4228-ac93-a8f2484ba584\" (UID: \"eceefe95-ad07-4228-ac93-a8f2484ba584\") " Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.859985 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww" (OuterVolumeSpecName: "kube-api-access-9ndww") pod "eceefe95-ad07-4228-ac93-a8f2484ba584" (UID: "eceefe95-ad07-4228-ac93-a8f2484ba584"). InnerVolumeSpecName "kube-api-access-9ndww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:08 crc kubenswrapper[4830]: I0318 18:23:08.948017 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ndww\" (UniqueName: \"kubernetes.io/projected/eceefe95-ad07-4228-ac93-a8f2484ba584-kube-api-access-9ndww\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.178885 4830 generic.go:334] "Generic (PLEG): container finished" podID="eceefe95-ad07-4228-ac93-a8f2484ba584" containerID="70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f" exitCode=2 Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.178970 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eceefe95-ad07-4228-ac93-a8f2484ba584","Type":"ContainerDied","Data":"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f"} Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.178983 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.179063 4830 scope.go:117] "RemoveContainer" containerID="70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.179045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eceefe95-ad07-4228-ac93-a8f2484ba584","Type":"ContainerDied","Data":"9051b9364ed790c8b9d8c8949d64fcbefd485723e7a773ed646932bcd1c0e6d0"} Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.210296 4830 scope.go:117] "RemoveContainer" containerID="70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f" Mar 18 18:23:09 crc kubenswrapper[4830]: E0318 18:23:09.210830 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f\": container with ID starting with 70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f not found: ID does not exist" containerID="70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.210868 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f"} err="failed to get container status \"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f\": rpc error: code = NotFound desc = could not find container \"70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f\": container with ID starting with 70f3b429e2562b27fda6ee8c73713da103efc539bc82849bd258e189d8678a3f not found: ID does not exist" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.231868 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 
18:23:09.245262 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.259932 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:09 crc kubenswrapper[4830]: E0318 18:23:09.260523 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eceefe95-ad07-4228-ac93-a8f2484ba584" containerName="kube-state-metrics" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.260593 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eceefe95-ad07-4228-ac93-a8f2484ba584" containerName="kube-state-metrics" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.260902 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eceefe95-ad07-4228-ac93-a8f2484ba584" containerName="kube-state-metrics" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.263671 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.267631 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.267850 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.271052 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.425367 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.425670 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.426003 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.455915 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.456025 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxbn\" (UniqueName: \"kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.456053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.456142 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.557992 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs\") pod 
\"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.558101 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.558251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxbn\" (UniqueName: \"kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.558291 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.568376 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.572720 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " 
pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.573216 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.588437 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxbn\" (UniqueName: \"kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn\") pod \"kube-state-metrics-0\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.881369 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.915569 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.916091 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="proxy-httpd" containerID="cri-o://3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da" gracePeriod=30 Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.916142 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="sg-core" containerID="cri-o://f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec" gracePeriod=30 Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.916034 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-central-agent" containerID="cri-o://ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425" gracePeriod=30 Mar 18 18:23:09 crc kubenswrapper[4830]: I0318 18:23:09.916113 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-notification-agent" containerID="cri-o://ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890" gracePeriod=30 Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.191157 4830 generic.go:334] "Generic (PLEG): container finished" podID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerID="3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da" exitCode=0 Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.191479 4830 generic.go:334] "Generic (PLEG): container finished" podID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerID="f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec" exitCode=2 Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.191243 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerDied","Data":"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da"} Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.191546 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerDied","Data":"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec"} Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.245482 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eceefe95-ad07-4228-ac93-a8f2484ba584" path="/var/lib/kubelet/pods/eceefe95-ad07-4228-ac93-a8f2484ba584/volumes" Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.370573 4830 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:23:10 crc kubenswrapper[4830]: W0318 18:23:10.374545 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e810512_a127_40b3_b1c2_559c3b86fcdb.slice/crio-bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f WatchSource:0}: Error finding container bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f: Status 404 returned error can't find the container with id bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.437939 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 18:23:10 crc kubenswrapper[4830]: I0318 18:23:10.438007 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.207729 4830 generic.go:334] "Generic (PLEG): container finished" podID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerID="ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425" exitCode=0 Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.208119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerDied","Data":"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"} Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.209882 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3e810512-a127-40b3-b1c2-559c3b86fcdb","Type":"ContainerStarted","Data":"966ee135f8e6e3d440939198bd3d2a3c627df5403d51fc43caced871d92ebe29"} Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.209914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3e810512-a127-40b3-b1c2-559c3b86fcdb","Type":"ContainerStarted","Data":"bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f"} Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.211052 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 18:23:11 crc kubenswrapper[4830]: I0318 18:23:11.240164 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8493888539999999 podStartE2EDuration="2.240138408s" podCreationTimestamp="2026-03-18 18:23:09 +0000 UTC" firstStartedPulling="2026-03-18 18:23:10.376984902 +0000 UTC m=+1224.944615244" lastFinishedPulling="2026-03-18 18:23:10.767734476 +0000 UTC m=+1225.335364798" observedRunningTime="2026-03-18 18:23:11.22812315 +0000 UTC m=+1225.795753482" watchObservedRunningTime="2026-03-18 18:23:11.240138408 +0000 UTC m=+1225.807768760" Mar 18 18:23:12 crc kubenswrapper[4830]: I0318 18:23:12.975091 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.022845 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk6xj\" (UniqueName: \"kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.022940 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.022985 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.023043 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.023059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.023113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.023219 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.023988 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.024005 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.028904 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj" (OuterVolumeSpecName: "kube-api-access-nk6xj") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "kube-api-access-nk6xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.031925 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts" (OuterVolumeSpecName: "scripts") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.061015 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.097993 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.123965 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data" (OuterVolumeSpecName: "config-data") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.124921 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") pod \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\" (UID: \"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e\") " Mar 18 18:23:13 crc kubenswrapper[4830]: W0318 18:23:13.125044 4830 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e/volumes/kubernetes.io~secret/config-data Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.125066 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data" (OuterVolumeSpecName: "config-data") pod "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" (UID: "6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.125980 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk6xj\" (UniqueName: \"kubernetes.io/projected/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-kube-api-access-nk6xj\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126081 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126178 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126265 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126423 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126506 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.126573 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.255946 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerID="ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890" exitCode=0 Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.256018 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.256075 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerDied","Data":"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890"} Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.256618 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e","Type":"ContainerDied","Data":"5c035e7c801dc707638d118c501fd37160cfa0933be9e2ca3419a5991ba53d28"} Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.256690 4830 scope.go:117] "RemoveContainer" containerID="3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.297579 4830 scope.go:117] "RemoveContainer" containerID="f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.324957 4830 scope.go:117] "RemoveContainer" containerID="ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890" Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.325085 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.336678 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.347439 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.347869 4830 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-central-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.347887 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-central-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.347914 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="proxy-httpd"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.347922 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="proxy-httpd"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.347936 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="sg-core"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.347941 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="sg-core"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.347954 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-notification-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.347960 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-notification-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.348127 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="sg-core"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.348148 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-notification-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.348160 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="ceilometer-central-agent"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.348171 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" containerName="proxy-httpd"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.349814 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.352827 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.352912 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.352947 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.372171 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.372545 4830 scope.go:117] "RemoveContainer" containerID="ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.397790 4830 scope.go:117] "RemoveContainer" containerID="3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.398226 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da\": container with ID starting with 3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da not found: ID does not exist" containerID="3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.398399 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da"} err="failed to get container status \"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da\": rpc error: code = NotFound desc = could not find container \"3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da\": container with ID starting with 3c380f95da70a75502bca3ec7796d5fe785517d13bb7466eac0bdb087b24b6da not found: ID does not exist"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.398528 4830 scope.go:117] "RemoveContainer" containerID="f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.398957 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec\": container with ID starting with f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec not found: ID does not exist" containerID="f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.398989 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec"} err="failed to get container status \"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec\": rpc error: code = NotFound desc = could not find container \"f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec\": container with ID starting with f05a0e14d67e6f266175e2596f5d09495ee04e7042a547872555de67589b64ec not found: ID does not exist"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.399014 4830 scope.go:117] "RemoveContainer" containerID="ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.399324 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890\": container with ID starting with ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890 not found: ID does not exist" containerID="ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.399349 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890"} err="failed to get container status \"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890\": rpc error: code = NotFound desc = could not find container \"ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890\": container with ID starting with ac2e5bc1c2fed9fc804f9cc6834aa51ea532d7c73bed28029f6a8d47dcdad890 not found: ID does not exist"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.399365 4830 scope.go:117] "RemoveContainer" containerID="ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"
Mar 18 18:23:13 crc kubenswrapper[4830]: E0318 18:23:13.399593 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425\": container with ID starting with ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425 not found: ID does not exist" containerID="ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.399694 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425"} err="failed to get container status \"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425\": rpc error: code = NotFound desc = could not find container \"ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425\": container with ID starting with ad2a9b53272ce4727ce4285e370a2c86cec16f7ce93b1bd7e3fbde719b45d425 not found: ID does not exist"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.431382 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.431659 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.431848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.431969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.432183 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6kw\" (UniqueName: \"kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.432253 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.432341 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.432357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.519125 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534588 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6kw\" (UniqueName: \"kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534714 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534763 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534808 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534904 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.534983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.536387 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.536426 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.539671 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.539787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.539914 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.540251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.540407 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.552651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6kw\" (UniqueName: \"kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw\") pod \"ceilometer-0\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " pod="openstack/ceilometer-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.558886 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 18:23:13 crc kubenswrapper[4830]: I0318 18:23:13.685692 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:23:14 crc kubenswrapper[4830]: I0318 18:23:14.127057 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:23:14 crc kubenswrapper[4830]: W0318 18:23:14.135104 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2631c83_8cce_42b7_88b4_226f2e8b4227.slice/crio-ef41da036fcd4aeafd2210d8a4fbf0f45929a13b31fbed582927645957940e33 WatchSource:0}: Error finding container ef41da036fcd4aeafd2210d8a4fbf0f45929a13b31fbed582927645957940e33: Status 404 returned error can't find the container with id ef41da036fcd4aeafd2210d8a4fbf0f45929a13b31fbed582927645957940e33
Mar 18 18:23:14 crc kubenswrapper[4830]: I0318 18:23:14.250354 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e" path="/var/lib/kubelet/pods/6f62dfdf-86e9-4d8e-94fa-ba98e79a9b2e/volumes"
Mar 18 18:23:14 crc kubenswrapper[4830]: I0318 18:23:14.269439 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerStarted","Data":"ef41da036fcd4aeafd2210d8a4fbf0f45929a13b31fbed582927645957940e33"}
Mar 18 18:23:14 crc kubenswrapper[4830]: I0318 18:23:14.305083 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 18:23:15 crc kubenswrapper[4830]: I0318 18:23:15.283897 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerStarted","Data":"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd"}
Mar 18 18:23:15 crc kubenswrapper[4830]: I0318 18:23:15.563041 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:23:15 crc kubenswrapper[4830]: I0318 18:23:15.564194 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:23:16 crc kubenswrapper[4830]: I0318 18:23:16.294980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerStarted","Data":"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023"}
Mar 18 18:23:16 crc kubenswrapper[4830]: I0318 18:23:16.644978 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:16 crc kubenswrapper[4830]: I0318 18:23:16.644996 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:17 crc kubenswrapper[4830]: I0318 18:23:17.308063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerStarted","Data":"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189"}
Mar 18 18:23:17 crc kubenswrapper[4830]: I0318 18:23:17.425391 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 18:23:17 crc kubenswrapper[4830]: I0318 18:23:17.425436 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 18:23:19 crc kubenswrapper[4830]: I0318 18:23:19.436681 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 18:23:19 crc kubenswrapper[4830]: I0318 18:23:19.442591 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 18:23:19 crc kubenswrapper[4830]: I0318 18:23:19.455027 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 18:23:19 crc kubenswrapper[4830]: I0318 18:23:19.889910 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 18 18:23:20 crc kubenswrapper[4830]: I0318 18:23:20.336827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerStarted","Data":"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94"}
Mar 18 18:23:20 crc kubenswrapper[4830]: I0318 18:23:20.346722 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 18:23:20 crc kubenswrapper[4830]: I0318 18:23:20.375556 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.068001421 podStartE2EDuration="7.375531643s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="2026-03-18 18:23:14.138088792 +0000 UTC m=+1228.705719144" lastFinishedPulling="2026-03-18 18:23:19.445619014 +0000 UTC m=+1234.013249366" observedRunningTime="2026-03-18 18:23:20.358860591 +0000 UTC m=+1234.926490933" watchObservedRunningTime="2026-03-18 18:23:20.375531643 +0000 UTC m=+1234.943161985"
Mar 18 18:23:21 crc kubenswrapper[4830]: I0318 18:23:21.344359 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.337590 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.358023 4830 generic.go:334] "Generic (PLEG): container finished" podID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" containerID="12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4" exitCode=137
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.358125 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.358229 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5","Type":"ContainerDied","Data":"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"}
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.358421 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5","Type":"ContainerDied","Data":"ceb527ac84080dd62d9eb820bdeb1b575f889cffdd1f681b2083b2261594ed8f"}
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.358445 4830 scope.go:117] "RemoveContainer" containerID="12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.389950 4830 scope.go:117] "RemoveContainer" containerID="12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"
Mar 18 18:23:22 crc kubenswrapper[4830]: E0318 18:23:22.390504 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4\": container with ID starting with 12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4 not found: ID does not exist" containerID="12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.390537 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4"} err="failed to get container status \"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4\": rpc error: code = NotFound desc = could not find container \"12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4\": container with ID starting with 12d65b613d7c7bb4a1bf18d31ba8c25084c1c109f742d8bc96be864bbf2d7fb4 not found: ID does not exist"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.419002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46bpj\" (UniqueName: \"kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj\") pod \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") "
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.419089 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle\") pod \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") "
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.419113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data\") pod \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\" (UID: \"bc92c3f3-06d7-4bab-8cd4-c68fed8308c5\") "
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.426362 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj" (OuterVolumeSpecName: "kube-api-access-46bpj") pod "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" (UID: "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5"). InnerVolumeSpecName "kube-api-access-46bpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.452020 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" (UID: "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.464004 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data" (OuterVolumeSpecName: "config-data") pod "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" (UID: "bc92c3f3-06d7-4bab-8cd4-c68fed8308c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.520121 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46bpj\" (UniqueName: \"kubernetes.io/projected/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-kube-api-access-46bpj\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.520156 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.520182 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.693160 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.702473 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.725831 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:23:22 crc kubenswrapper[4830]: E0318 18:23:22.726536 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.726626 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.726959 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.727895 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.730313 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.731231 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.731442 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.735541 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.926604 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdv6\" (UniqueName: \"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.927153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.927342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.927526 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:22 crc kubenswrapper[4830]: I0318 18:23:22.927684 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.030120 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.030243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.030290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.030374 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdv6\" (UniqueName: \"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.030403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.034379 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.035633 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.036074 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.036873 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.058378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdv6\" (UniqueName: \"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.351617 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.563632 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.563930 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 18:23:23 crc kubenswrapper[4830]: W0318 18:23:23.842620 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbab36094_736f_460a_83d1_bd298dee7774.slice/crio-92cdf158326b48ba0322e8a532b52ede524b5b52f72cda66606bad3322557e82 WatchSource:0}: Error finding container 92cdf158326b48ba0322e8a532b52ede524b5b52f72cda66606bad3322557e82: Status 404 returned error can't find the container with id 92cdf158326b48ba0322e8a532b52ede524b5b52f72cda66606bad3322557e82
Mar 18 18:23:23 crc kubenswrapper[4830]: I0318 18:23:23.842631 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:23:24 crc kubenswrapper[4830]: I0318 18:23:24.246008 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc92c3f3-06d7-4bab-8cd4-c68fed8308c5" path="/var/lib/kubelet/pods/bc92c3f3-06d7-4bab-8cd4-c68fed8308c5/volumes"
Mar 18 18:23:24 crc kubenswrapper[4830]: I0318 18:23:24.384506 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bab36094-736f-460a-83d1-bd298dee7774","Type":"ContainerStarted","Data":"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7"}
Mar 18 18:23:24 crc kubenswrapper[4830]: I0318 18:23:24.384574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bab36094-736f-460a-83d1-bd298dee7774","Type":"ContainerStarted","Data":"92cdf158326b48ba0322e8a532b52ede524b5b52f72cda66606bad3322557e82"}
Mar 18 18:23:25 crc kubenswrapper[4830]: I0318 18:23:25.568763 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 18:23:25 crc kubenswrapper[4830]: I0318 18:23:25.574446 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 18:23:25 crc kubenswrapper[4830]: I0318 18:23:25.580212 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 18:23:25 crc kubenswrapper[4830]: I0318 18:23:25.598092 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.598066876 podStartE2EDuration="3.598066876s" podCreationTimestamp="2026-03-18 18:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:24.411206077 +0000 UTC m=+1238.978836459" watchObservedRunningTime="2026-03-18 18:23:25.598066876 +0000 UTC m=+1240.165697208"
Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.414850 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 18:23:26 crc
kubenswrapper[4830]: I0318 18:23:26.625010 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"] Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.627741 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.633155 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"] Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713285 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713332 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713471 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twsxn\" (UniqueName: \"kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713692 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config\") 
pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713790 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.713854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.814732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.815064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.815094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: 
\"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.815126 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.815150 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.815198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twsxn\" (UniqueName: \"kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.816102 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.816125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " 
pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.816125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.816625 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.816889 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.832843 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twsxn\" (UniqueName: \"kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn\") pod \"dnsmasq-dns-6bd85b459c-7ck7d\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:26 crc kubenswrapper[4830]: I0318 18:23:26.959211 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:27 crc kubenswrapper[4830]: I0318 18:23:27.412655 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"] Mar 18 18:23:27 crc kubenswrapper[4830]: W0318 18:23:27.415100 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ecee90_92ea_4ef1_a871_49018c2ac648.slice/crio-dbaa861ff7070c74290ea330d4c08aa1519ccc9ef2b2603c86dae354a5f498fa WatchSource:0}: Error finding container dbaa861ff7070c74290ea330d4c08aa1519ccc9ef2b2603c86dae354a5f498fa: Status 404 returned error can't find the container with id dbaa861ff7070c74290ea330d4c08aa1519ccc9ef2b2603c86dae354a5f498fa Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.223065 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.223554 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-central-agent" containerID="cri-o://e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" gracePeriod=30 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.223603 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="proxy-httpd" containerID="cri-o://2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" gracePeriod=30 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.223659 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="sg-core" containerID="cri-o://c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" gracePeriod=30 Mar 18 18:23:28 crc 
kubenswrapper[4830]: I0318 18:23:28.223713 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-notification-agent" containerID="cri-o://dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" gracePeriod=30 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.352226 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.427840 4830 generic.go:334] "Generic (PLEG): container finished" podID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerID="5877e480f98ac8df8a3e7169161ee11df8c8ec531baa72e278500934c0be1c69" exitCode=0 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.427923 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" event={"ID":"86ecee90-92ea-4ef1-a871-49018c2ac648","Type":"ContainerDied","Data":"5877e480f98ac8df8a3e7169161ee11df8c8ec531baa72e278500934c0be1c69"} Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.428163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" event={"ID":"86ecee90-92ea-4ef1-a871-49018c2ac648","Type":"ContainerStarted","Data":"dbaa861ff7070c74290ea330d4c08aa1519ccc9ef2b2603c86dae354a5f498fa"} Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.431048 4830 generic.go:334] "Generic (PLEG): container finished" podID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerID="2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" exitCode=0 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.431080 4830 generic.go:334] "Generic (PLEG): container finished" podID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerID="c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" exitCode=2 Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.431097 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerDied","Data":"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94"} Mar 18 18:23:28 crc kubenswrapper[4830]: I0318 18:23:28.431119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerDied","Data":"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.101789 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.102224 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-log" containerID="cri-o://35816140a6f613f53804dd433976493f1668d481308131393e3b3babebfa4eb5" gracePeriod=30 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.102357 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-api" containerID="cri-o://ae5d24f64c1c35cc8a1935c826501d58649d4660951145043ac21c04591ba75f" gracePeriod=30 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.254251 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374272 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374443 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374517 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374543 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374600 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374720 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6kw\" (UniqueName: \"kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.374745 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts\") pod \"e2631c83-8cce-42b7-88b4-226f2e8b4227\" (UID: \"e2631c83-8cce-42b7-88b4-226f2e8b4227\") " Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.375211 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.375357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.376040 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.376058 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2631c83-8cce-42b7-88b4-226f2e8b4227-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.380073 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts" (OuterVolumeSpecName: "scripts") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.385032 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw" (OuterVolumeSpecName: "kube-api-access-mz6kw") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "kube-api-access-mz6kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.423038 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.448046 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerID="35816140a6f613f53804dd433976493f1668d481308131393e3b3babebfa4eb5" exitCode=143 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.448119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerDied","Data":"35816140a6f613f53804dd433976493f1668d481308131393e3b3babebfa4eb5"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.449834 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" event={"ID":"86ecee90-92ea-4ef1-a871-49018c2ac648","Type":"ContainerStarted","Data":"1eb0db0b8dfbe3a3b14e7bb26b25f620aed32ce646e43dd05cbe50fab52b6163"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.450968 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.454504 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.457584 4830 generic.go:334] "Generic (PLEG): container finished" podID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerID="dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" exitCode=0 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.457673 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.457730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerDied","Data":"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.457790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerDied","Data":"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.457811 4830 scope.go:117] "RemoveContainer" containerID="2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.458139 4830 generic.go:334] "Generic (PLEG): container finished" podID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerID="e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" exitCode=0 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.458178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2631c83-8cce-42b7-88b4-226f2e8b4227","Type":"ContainerDied","Data":"ef41da036fcd4aeafd2210d8a4fbf0f45929a13b31fbed582927645957940e33"} Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.478560 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" podStartSLOduration=3.4784236809999998 podStartE2EDuration="3.478423681s" podCreationTimestamp="2026-03-18 18:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:29.469248702 +0000 UTC m=+1244.036879034" watchObservedRunningTime="2026-03-18 18:23:29.478423681 +0000 UTC m=+1244.046054013" Mar 18 18:23:29 crc kubenswrapper[4830]: 
I0318 18:23:29.479449 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.479478 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.479488 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6kw\" (UniqueName: \"kubernetes.io/projected/e2631c83-8cce-42b7-88b4-226f2e8b4227-kube-api-access-mz6kw\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.479500 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.493017 4830 scope.go:117] "RemoveContainer" containerID="c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.513919 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.513999 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:23:29 crc 
kubenswrapper[4830]: I0318 18:23:29.514072 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.515112 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.515191 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c" gracePeriod=600 Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.531910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data" (OuterVolumeSpecName: "config-data") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.535893 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2631c83-8cce-42b7-88b4-226f2e8b4227" (UID: "e2631c83-8cce-42b7-88b4-226f2e8b4227"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.540243 4830 scope.go:117] "RemoveContainer" containerID="dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.561350 4830 scope.go:117] "RemoveContainer" containerID="e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.581039 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.581234 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2631c83-8cce-42b7-88b4-226f2e8b4227-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.583167 4830 scope.go:117] "RemoveContainer" containerID="2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.583463 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94\": container with ID starting with 2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94 not found: ID does not exist" containerID="2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.583493 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94"} err="failed to get container status \"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94\": rpc error: code = NotFound desc = could not find container 
\"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94\": container with ID starting with 2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.583512 4830 scope.go:117] "RemoveContainer" containerID="c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.584023 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189\": container with ID starting with c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189 not found: ID does not exist" containerID="c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.584064 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189"} err="failed to get container status \"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189\": rpc error: code = NotFound desc = could not find container \"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189\": container with ID starting with c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.584107 4830 scope.go:117] "RemoveContainer" containerID="dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.584398 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023\": container with ID starting with dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023 not found: ID does not exist" 
containerID="dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.597997 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023"} err="failed to get container status \"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023\": rpc error: code = NotFound desc = could not find container \"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023\": container with ID starting with dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.598036 4830 scope.go:117] "RemoveContainer" containerID="e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.598536 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd\": container with ID starting with e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd not found: ID does not exist" containerID="e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.598561 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd"} err="failed to get container status \"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd\": rpc error: code = NotFound desc = could not find container \"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd\": container with ID starting with e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.598576 4830 scope.go:117] 
"RemoveContainer" containerID="2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.599528 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94"} err="failed to get container status \"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94\": rpc error: code = NotFound desc = could not find container \"2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94\": container with ID starting with 2b6031c6b23e1c352ff4f06217ad561a0959af2fae57ac06547e777217b7bb94 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.599550 4830 scope.go:117] "RemoveContainer" containerID="c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.600062 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189"} err="failed to get container status \"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189\": rpc error: code = NotFound desc = could not find container \"c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189\": container with ID starting with c25b569c99c695c49e370158227a2b977a290800356a8041ebb63d09f6153189 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.600080 4830 scope.go:117] "RemoveContainer" containerID="dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.600271 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023"} err="failed to get container status \"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023\": rpc error: code = 
NotFound desc = could not find container \"dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023\": container with ID starting with dfdda4d2ffef289ad7cfd39a3ea1e0257392b492df4a96ea229c8996f11d2023 not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.600293 4830 scope.go:117] "RemoveContainer" containerID="e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.600501 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd"} err="failed to get container status \"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd\": rpc error: code = NotFound desc = could not find container \"e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd\": container with ID starting with e3040cd8cfa1f3743082305952878da88f422fad07839805693e457987e1defd not found: ID does not exist" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.796985 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.808660 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.819573 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.819941 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-notification-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.819955 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-notification-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.819978 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-central-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.819985 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-central-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.819997 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="proxy-httpd" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820003 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="proxy-httpd" Mar 18 18:23:29 crc kubenswrapper[4830]: E0318 18:23:29.820013 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="sg-core" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820019 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="sg-core" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820183 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="proxy-httpd" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820192 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-central-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820207 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="sg-core" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.820215 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" containerName="ceilometer-notification-agent" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.822018 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.824358 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.824470 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.824552 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.840056 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.987922 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.988019 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwv47\" (UniqueName: \"kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.988053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.988827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.988946 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.988999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.989085 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:29 crc kubenswrapper[4830]: I0318 18:23:29.989201 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.090846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwv47\" (UniqueName: \"kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47\") pod \"ceilometer-0\" 
(UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091258 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091289 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091321 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091375 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.091495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.092045 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.092083 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.096984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.098470 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.098591 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.099106 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.112133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.127040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwv47\" (UniqueName: \"kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47\") pod \"ceilometer-0\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") " pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.188586 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.201373 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.246788 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2631c83-8cce-42b7-88b4-226f2e8b4227" path="/var/lib/kubelet/pods/e2631c83-8cce-42b7-88b4-226f2e8b4227/volumes" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.469012 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c" exitCode=0 Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.469212 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c"} Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.469336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a"} Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.469358 4830 scope.go:117] "RemoveContainer" containerID="0f0582e7c69a5ff0a523a01804a4f3c9becc735481bb91df9516cfe7387f2359" Mar 18 18:23:30 crc kubenswrapper[4830]: I0318 18:23:30.652764 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:30 crc kubenswrapper[4830]: W0318 18:23:30.656972 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da95371_091a_4a62_b2c9_92ed39b8a65c.slice/crio-34b427733a2d0e3f7fb6d3ab5bb46acc389a9f9b8cb44be41e45ff54230e07f5 WatchSource:0}: Error finding container 34b427733a2d0e3f7fb6d3ab5bb46acc389a9f9b8cb44be41e45ff54230e07f5: Status 404 returned error can't find the container with id 34b427733a2d0e3f7fb6d3ab5bb46acc389a9f9b8cb44be41e45ff54230e07f5 Mar 18 18:23:31 crc kubenswrapper[4830]: I0318 18:23:31.486237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerStarted","Data":"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"} Mar 18 18:23:31 crc kubenswrapper[4830]: I0318 18:23:31.486678 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerStarted","Data":"34b427733a2d0e3f7fb6d3ab5bb46acc389a9f9b8cb44be41e45ff54230e07f5"} Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.496512 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerID="ae5d24f64c1c35cc8a1935c826501d58649d4660951145043ac21c04591ba75f" exitCode=0 Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.496933 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerDied","Data":"ae5d24f64c1c35cc8a1935c826501d58649d4660951145043ac21c04591ba75f"} Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.679942 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.772280 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data\") pod \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.772342 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle\") pod \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.772386 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg58n\" (UniqueName: \"kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n\") pod \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.772474 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs\") pod \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\" (UID: \"a6cc0c25-d795-4421-abbe-bfdcefb9db61\") " Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.774481 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs" (OuterVolumeSpecName: "logs") pod "a6cc0c25-d795-4421-abbe-bfdcefb9db61" (UID: "a6cc0c25-d795-4421-abbe-bfdcefb9db61"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.779369 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n" (OuterVolumeSpecName: "kube-api-access-qg58n") pod "a6cc0c25-d795-4421-abbe-bfdcefb9db61" (UID: "a6cc0c25-d795-4421-abbe-bfdcefb9db61"). InnerVolumeSpecName "kube-api-access-qg58n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.806915 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6cc0c25-d795-4421-abbe-bfdcefb9db61" (UID: "a6cc0c25-d795-4421-abbe-bfdcefb9db61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.824034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data" (OuterVolumeSpecName: "config-data") pod "a6cc0c25-d795-4421-abbe-bfdcefb9db61" (UID: "a6cc0c25-d795-4421-abbe-bfdcefb9db61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.874903 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.874938 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cc0c25-d795-4421-abbe-bfdcefb9db61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.874952 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg58n\" (UniqueName: \"kubernetes.io/projected/a6cc0c25-d795-4421-abbe-bfdcefb9db61-kube-api-access-qg58n\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:32 crc kubenswrapper[4830]: I0318 18:23:32.874964 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6cc0c25-d795-4421-abbe-bfdcefb9db61-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.352439 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.369569 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.514912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerStarted","Data":"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"} Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.514961 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerStarted","Data":"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"} Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.517944 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.519098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6cc0c25-d795-4421-abbe-bfdcefb9db61","Type":"ContainerDied","Data":"dc058af0be962eec06a8d1b36a9585d165a0148144c020b313f616f0c4949ca9"} Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.519166 4830 scope.go:117] "RemoveContainer" containerID="ae5d24f64c1c35cc8a1935c826501d58649d4660951145043ac21c04591ba75f" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.553870 4830 scope.go:117] "RemoveContainer" containerID="35816140a6f613f53804dd433976493f1668d481308131393e3b3babebfa4eb5" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.559629 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.577117 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.592237 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.607850 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:33 crc kubenswrapper[4830]: E0318 18:23:33.608380 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-log" Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.608405 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-log" Mar 18 18:23:33 crc 
kubenswrapper[4830]: E0318 18:23:33.608425 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-api"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.608432 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-api"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.608606 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-log"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.608626 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" containerName="nova-api-api"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.609656 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.621799 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.621925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.621979 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.621996 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689690 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4j2j\" (UniqueName: \"kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689736 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.689810 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792708 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792787 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4j2j\" (UniqueName: \"kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792807 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.792958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.793443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.803512 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pnc5q"]
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.809009 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.811411 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.814277 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.815366 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.818210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.837308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.842580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.863829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pnc5q"]
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.865401 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4j2j\" (UniqueName: \"kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j\") pod \"nova-api-0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.894356 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.894420 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9spq\" (UniqueName: \"kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.894513 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.894590 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.938655 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.996848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.996941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.997002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:33 crc kubenswrapper[4830]: I0318 18:23:33.997022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9spq\" (UniqueName: \"kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.001355 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.001571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.001689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.014001 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9spq\" (UniqueName: \"kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq\") pod \"nova-cell1-cell-mapping-pnc5q\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") " pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.238660 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:34 crc kubenswrapper[4830]: I0318 18:23:34.245731 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cc0c25-d795-4421-abbe-bfdcefb9db61" path="/var/lib/kubelet/pods/a6cc0c25-d795-4421-abbe-bfdcefb9db61/volumes"
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.136667 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pnc5q"]
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.154159 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:23:35 crc kubenswrapper[4830]: W0318 18:23:35.165910 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b25fae_fe31_4c24_b22c_9a459c4ecebc.slice/crio-bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e WatchSource:0}: Error finding container bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e: Status 404 returned error can't find the container with id bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e
Mar 18 18:23:35 crc kubenswrapper[4830]: W0318 18:23:35.169169 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e3c2e1_af49_49f4_ba46_9efd70ee61c0.slice/crio-69738431af4325afd2c419682d5dff2fc7ca7399d7ee627229c2ecef8bb8c24a WatchSource:0}: Error finding container 69738431af4325afd2c419682d5dff2fc7ca7399d7ee627229c2ecef8bb8c24a: Status 404 returned error can't find the container with id 69738431af4325afd2c419682d5dff2fc7ca7399d7ee627229c2ecef8bb8c24a
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.542725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerStarted","Data":"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16"}
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.542780 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerStarted","Data":"69738431af4325afd2c419682d5dff2fc7ca7399d7ee627229c2ecef8bb8c24a"}
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.545829 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerStarted","Data":"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"}
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.545980 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-central-agent" containerID="cri-o://5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779" gracePeriod=30
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.546093 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="proxy-httpd" containerID="cri-o://c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120" gracePeriod=30
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.546130 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.546187 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-notification-agent" containerID="cri-o://ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a" gracePeriod=30
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.546844 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="sg-core" containerID="cri-o://e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd" gracePeriod=30
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.550933 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pnc5q" event={"ID":"59b25fae-fe31-4c24-b22c-9a459c4ecebc","Type":"ContainerStarted","Data":"529a3e227f8ade8e3311b168c9e554155f352338f229794d4a4c7d826510abae"}
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.550974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pnc5q" event={"ID":"59b25fae-fe31-4c24-b22c-9a459c4ecebc","Type":"ContainerStarted","Data":"bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e"}
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.582192 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.169900091 podStartE2EDuration="6.582175008s" podCreationTimestamp="2026-03-18 18:23:29 +0000 UTC" firstStartedPulling="2026-03-18 18:23:30.658996923 +0000 UTC m=+1245.226627255" lastFinishedPulling="2026-03-18 18:23:35.07127184 +0000 UTC m=+1249.638902172" observedRunningTime="2026-03-18 18:23:35.569279774 +0000 UTC m=+1250.136910106" watchObservedRunningTime="2026-03-18 18:23:35.582175008 +0000 UTC m=+1250.149805340"
Mar 18 18:23:35 crc kubenswrapper[4830]: I0318 18:23:35.592863 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pnc5q" podStartSLOduration=2.592837009 podStartE2EDuration="2.592837009s" podCreationTimestamp="2026-03-18 18:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:35.592278314 +0000 UTC m=+1250.159908646" watchObservedRunningTime="2026-03-18 18:23:35.592837009 +0000 UTC m=+1250.160467351"
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.564263 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerStarted","Data":"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf"}
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.567725 4830 generic.go:334] "Generic (PLEG): container finished" podID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerID="e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd" exitCode=2
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.567754 4830 generic.go:334] "Generic (PLEG): container finished" podID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerID="ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a" exitCode=0
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.567821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerDied","Data":"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"}
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.567868 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerDied","Data":"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"}
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.599380 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.599361362 podStartE2EDuration="3.599361362s" podCreationTimestamp="2026-03-18 18:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:36.588842205 +0000 UTC m=+1251.156472537" watchObservedRunningTime="2026-03-18 18:23:36.599361362 +0000 UTC m=+1251.166991694"
Mar 18 18:23:36 crc kubenswrapper[4830]: I0318 18:23:36.962428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.059306 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"]
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.059538 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="dnsmasq-dns" containerID="cri-o://50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5" gracePeriod=10
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.563254 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.578126 4830 generic.go:334] "Generic (PLEG): container finished" podID="25520247-79bc-4a87-abaf-57cd2e711a99" containerID="50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5" exitCode=0
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.578310 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" event={"ID":"25520247-79bc-4a87-abaf-57cd2e711a99","Type":"ContainerDied","Data":"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"}
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.579030 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" event={"ID":"25520247-79bc-4a87-abaf-57cd2e711a99","Type":"ContainerDied","Data":"26f6d2b50b3ead96107aae571ab80074596a2330376ba03711041cab6812c723"}
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.579054 4830 scope.go:117] "RemoveContainer" containerID="50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.578399 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.606207 4830 scope.go:117] "RemoveContainer" containerID="b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.626858 4830 scope.go:117] "RemoveContainer" containerID="50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"
Mar 18 18:23:37 crc kubenswrapper[4830]: E0318 18:23:37.627357 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5\": container with ID starting with 50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5 not found: ID does not exist" containerID="50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.627484 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5"} err="failed to get container status \"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5\": rpc error: code = NotFound desc = could not find container \"50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5\": container with ID starting with 50be01a9e054905f2176b43b44c6f84b5ef7a1a1c4414136310d958c154087b5 not found: ID does not exist"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.627578 4830 scope.go:117] "RemoveContainer" containerID="b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208"
Mar 18 18:23:37 crc kubenswrapper[4830]: E0318 18:23:37.630993 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208\": container with ID starting with b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208 not found: ID does not exist" containerID="b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.631017 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208"} err="failed to get container status \"b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208\": rpc error: code = NotFound desc = could not find container \"b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208\": container with ID starting with b113e35a62fc7a7325b784db3f281726e66cfb7fbc5fc1c68c20b2099dadd208 not found: ID does not exist"
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687172 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687244 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687297 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8zr\" (UniqueName: \"kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687382 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687407 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.687424 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb\") pod \"25520247-79bc-4a87-abaf-57cd2e711a99\" (UID: \"25520247-79bc-4a87-abaf-57cd2e711a99\") "
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.703091 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr" (OuterVolumeSpecName: "kube-api-access-5p8zr") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "kube-api-access-5p8zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.741628 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config" (OuterVolumeSpecName: "config") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.742595 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.751323 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.757225 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.772253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25520247-79bc-4a87-abaf-57cd2e711a99" (UID: "25520247-79bc-4a87-abaf-57cd2e711a99"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.789939 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.789992 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.790007 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p8zr\" (UniqueName: \"kubernetes.io/projected/25520247-79bc-4a87-abaf-57cd2e711a99-kube-api-access-5p8zr\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.790018 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.790028 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.790037 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25520247-79bc-4a87-abaf-57cd2e711a99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.912255 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"]
Mar 18 18:23:37 crc kubenswrapper[4830]: I0318 18:23:37.923460 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-kpxvt"]
Mar 18 18:23:38 crc kubenswrapper[4830]: I0318 18:23:38.252167 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" path="/var/lib/kubelet/pods/25520247-79bc-4a87-abaf-57cd2e711a99/volumes"
Mar 18 18:23:39 crc kubenswrapper[4830]: I0318 18:23:39.606784 4830 generic.go:334] "Generic (PLEG): container finished" podID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerID="5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779" exitCode=0
Mar 18 18:23:39 crc kubenswrapper[4830]: I0318 18:23:39.606847 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerDied","Data":"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"}
Mar 18 18:23:40 crc kubenswrapper[4830]: I0318 18:23:40.626585 4830 generic.go:334] "Generic (PLEG): container finished" podID="59b25fae-fe31-4c24-b22c-9a459c4ecebc" containerID="529a3e227f8ade8e3311b168c9e554155f352338f229794d4a4c7d826510abae" exitCode=0
Mar 18 18:23:40 crc kubenswrapper[4830]: I0318 18:23:40.626665 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pnc5q" event={"ID":"59b25fae-fe31-4c24-b22c-9a459c4ecebc","Type":"ContainerDied","Data":"529a3e227f8ade8e3311b168c9e554155f352338f229794d4a4c7d826510abae"}
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.057970 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pnc5q"
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.184869 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9spq\" (UniqueName: \"kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq\") pod \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") "
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.185016 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts\") pod \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") "
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.185120 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data\") pod \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") "
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.185309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle\") pod \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\" (UID: \"59b25fae-fe31-4c24-b22c-9a459c4ecebc\") "
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.191178 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts" (OuterVolumeSpecName: "scripts") pod "59b25fae-fe31-4c24-b22c-9a459c4ecebc" (UID: "59b25fae-fe31-4c24-b22c-9a459c4ecebc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.192065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq" (OuterVolumeSpecName: "kube-api-access-q9spq") pod "59b25fae-fe31-4c24-b22c-9a459c4ecebc" (UID: "59b25fae-fe31-4c24-b22c-9a459c4ecebc"). InnerVolumeSpecName "kube-api-access-q9spq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.213959 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59b25fae-fe31-4c24-b22c-9a459c4ecebc" (UID: "59b25fae-fe31-4c24-b22c-9a459c4ecebc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.239831 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data" (OuterVolumeSpecName: "config-data") pod "59b25fae-fe31-4c24-b22c-9a459c4ecebc" (UID: "59b25fae-fe31-4c24-b22c-9a459c4ecebc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.299506 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.299560 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9spq\" (UniqueName: \"kubernetes.io/projected/59b25fae-fe31-4c24-b22c-9a459c4ecebc-kube-api-access-q9spq\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.299585 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.299602 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b25fae-fe31-4c24-b22c-9a459c4ecebc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.328121 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b495b9cc7-kpxvt" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.653737 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pnc5q" event={"ID":"59b25fae-fe31-4c24-b22c-9a459c4ecebc","Type":"ContainerDied","Data":"bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e"} Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.653862 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc41128319b44d961b3628c61abc4424f20d4872fc795cb4f5698f7f39e8bc3e" Mar 18 18:23:42 crc kubenswrapper[4830]: 
I0318 18:23:42.653833 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pnc5q" Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.906583 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.906989 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-log" containerID="cri-o://ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" gracePeriod=30 Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.907512 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-api" containerID="cri-o://2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" gracePeriod=30 Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.930540 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.930920 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ff705200-15b1-471b-a5af-97566ce67516" containerName="nova-scheduler-scheduler" containerID="cri-o://8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" gracePeriod=30 Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.953961 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.954385 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-metadata" containerID="cri-o://adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e" gracePeriod=30 
Mar 18 18:23:42 crc kubenswrapper[4830]: I0318 18:23:42.954252 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-log" containerID="cri-o://3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68" gracePeriod=30 Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.434750 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.523923 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.524390 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4j2j\" (UniqueName: \"kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.524494 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.524621 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc 
kubenswrapper[4830]: I0318 18:23:43.524648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.524738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.524781 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs\") pod \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\" (UID: \"43e3c2e1-af49-49f4-ba46-9efd70ee61c0\") " Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.526811 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.527079 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs" (OuterVolumeSpecName: "logs") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.530340 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j" (OuterVolumeSpecName: "kube-api-access-m4j2j") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "kube-api-access-m4j2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.530349 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.530465 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ff705200-15b1-471b-a5af-97566ce67516" containerName="nova-scheduler-scheduler" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.555037 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data" (OuterVolumeSpecName: "config-data") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.560070 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.593500 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.594493 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "43e3c2e1-af49-49f4-ba46-9efd70ee61c0" (UID: "43e3c2e1-af49-49f4-ba46-9efd70ee61c0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627388 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627435 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627448 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627461 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627477 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4j2j\" (UniqueName: \"kubernetes.io/projected/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-kube-api-access-m4j2j\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.627496 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43e3c2e1-af49-49f4-ba46-9efd70ee61c0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.665164 4830 generic.go:334] "Generic (PLEG): container finished" podID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerID="3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68" exitCode=143 Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.665238 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerDied","Data":"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68"} Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667468 4830 generic.go:334] "Generic (PLEG): container finished" podID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerID="2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" exitCode=0 Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667492 4830 generic.go:334] "Generic (PLEG): container finished" podID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerID="ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" exitCode=143 Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667507 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerDied","Data":"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf"} Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667525 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerDied","Data":"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16"} Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667536 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43e3c2e1-af49-49f4-ba46-9efd70ee61c0","Type":"ContainerDied","Data":"69738431af4325afd2c419682d5dff2fc7ca7399d7ee627229c2ecef8bb8c24a"} Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667551 4830 scope.go:117] "RemoveContainer" containerID="2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.667606 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.708336 4830 scope.go:117] "RemoveContainer" containerID="ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.724891 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.735137 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.749684 4830 scope.go:117] "RemoveContainer" containerID="2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.751446 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.751920 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-log" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.751941 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-log" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.751960 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="dnsmasq-dns" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.751971 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="dnsmasq-dns" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.751989 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="init" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.751997 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="init" 
Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.752014 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b25fae-fe31-4c24-b22c-9a459c4ecebc" containerName="nova-manage" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752023 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b25fae-fe31-4c24-b22c-9a459c4ecebc" containerName="nova-manage" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.752042 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-api" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752051 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-api" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752279 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b25fae-fe31-4c24-b22c-9a459c4ecebc" containerName="nova-manage" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752296 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="25520247-79bc-4a87-abaf-57cd2e711a99" containerName="dnsmasq-dns" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752341 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-api" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.752374 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" containerName="nova-api-log" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.753597 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.755378 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf\": container with ID starting with 2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf not found: ID does not exist" containerID="2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.755416 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf"} err="failed to get container status \"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf\": rpc error: code = NotFound desc = could not find container \"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf\": container with ID starting with 2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf not found: ID does not exist" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.755447 4830 scope.go:117] "RemoveContainer" containerID="ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" Mar 18 18:23:43 crc kubenswrapper[4830]: E0318 18:23:43.757273 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16\": container with ID starting with ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16 not found: ID does not exist" containerID="ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.757356 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16"} err="failed to 
get container status \"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16\": rpc error: code = NotFound desc = could not find container \"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16\": container with ID starting with ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16 not found: ID does not exist" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.757401 4830 scope.go:117] "RemoveContainer" containerID="2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.758018 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf"} err="failed to get container status \"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf\": rpc error: code = NotFound desc = could not find container \"2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf\": container with ID starting with 2ff2d60391d175c92bbb8b788a67210ac736d78d0e09ed108b2b0e45a8a58fcf not found: ID does not exist" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.758057 4830 scope.go:117] "RemoveContainer" containerID="ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.758280 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16"} err="failed to get container status \"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16\": rpc error: code = NotFound desc = could not find container \"ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16\": container with ID starting with ee5c6f31e6d02731861f61693ae219e41705976b4fec2867532e615f43ad5b16 not found: ID does not exist" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.760353 4830 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.760554 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.760798 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.776910 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.833960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.834053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.834095 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.834178 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " 
pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.834282 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86k49\" (UniqueName: \"kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.834324 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.935904 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.935970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86k49\" (UniqueName: \"kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.935990 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.936043 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.936541 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.936610 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.936896 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.940234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.940829 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.941721 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.946216 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:43 crc kubenswrapper[4830]: I0318 18:23:43.952846 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k49\" (UniqueName: \"kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49\") pod \"nova-api-0\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " pod="openstack/nova-api-0" Mar 18 18:23:44 crc kubenswrapper[4830]: I0318 18:23:44.073206 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:23:44 crc kubenswrapper[4830]: I0318 18:23:44.249141 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e3c2e1-af49-49f4-ba46-9efd70ee61c0" path="/var/lib/kubelet/pods/43e3c2e1-af49-49f4-ba46-9efd70ee61c0/volumes" Mar 18 18:23:44 crc kubenswrapper[4830]: I0318 18:23:44.541760 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:23:44 crc kubenswrapper[4830]: I0318 18:23:44.678672 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerStarted","Data":"f33077f9185a7354604c1c307deb7f9d8596ac8e975665c909a3a47886c7b2ac"} Mar 18 18:23:45 crc kubenswrapper[4830]: I0318 18:23:45.697243 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerStarted","Data":"91aff4166cbebec7917a849f1dae12a4f2caababfa680539bc75bf53f49cf551"} Mar 18 18:23:45 crc kubenswrapper[4830]: I0318 18:23:45.700399 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerStarted","Data":"7269557d2134b5328a9871c88e8307ed1155b8cea2686c2ae04cc355079f438f"} Mar 18 18:23:45 crc kubenswrapper[4830]: I0318 18:23:45.750951 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.750914697 podStartE2EDuration="2.750914697s" podCreationTimestamp="2026-03-18 18:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:45.727101574 +0000 UTC m=+1260.294731946" watchObservedRunningTime="2026-03-18 18:23:45.750914697 +0000 UTC m=+1260.318545059" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.670613 4830 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.724307 4830 generic.go:334] "Generic (PLEG): container finished" podID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerID="adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e" exitCode=0 Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.724593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerDied","Data":"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e"} Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.724671 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.724692 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d","Type":"ContainerDied","Data":"4f964093b91b5f5c5ca440d51fb40c3661385f00ab648b3dcc670918940b3038"} Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.724712 4830 scope.go:117] "RemoveContainer" containerID="adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.768445 4830 scope.go:117] "RemoveContainer" containerID="3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.796338 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs\") pod \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.796513 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mtklf\" (UniqueName: \"kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf\") pod \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.796684 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data\") pod \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.796865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle\") pod \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.796972 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs\") pod \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\" (UID: \"6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d\") " Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.799794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs" (OuterVolumeSpecName: "logs") pod "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" (UID: "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.805046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf" (OuterVolumeSpecName: "kube-api-access-mtklf") pod "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" (UID: "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d"). InnerVolumeSpecName "kube-api-access-mtklf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.814735 4830 scope.go:117] "RemoveContainer" containerID="adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e" Mar 18 18:23:46 crc kubenswrapper[4830]: E0318 18:23:46.815246 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e\": container with ID starting with adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e not found: ID does not exist" containerID="adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.815289 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e"} err="failed to get container status \"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e\": rpc error: code = NotFound desc = could not find container \"adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e\": container with ID starting with adee33e11c5c3bc1e71277096d004012a317b9250abfca3711a234ab58e6379e not found: ID does not exist" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.815318 4830 scope.go:117] "RemoveContainer" containerID="3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68" Mar 18 18:23:46 crc kubenswrapper[4830]: E0318 18:23:46.816986 
4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68\": container with ID starting with 3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68 not found: ID does not exist" containerID="3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.817013 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68"} err="failed to get container status \"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68\": rpc error: code = NotFound desc = could not find container \"3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68\": container with ID starting with 3e8a58d9287f21f167fefaeca6f3af608e8ac6f3b70f32f6d87d0fa5a94baf68 not found: ID does not exist" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.834351 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" (UID: "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.852880 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data" (OuterVolumeSpecName: "config-data") pod "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" (UID: "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.859534 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" (UID: "6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.900738 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.900793 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.900811 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.900824 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:46 crc kubenswrapper[4830]: I0318 18:23:46.900839 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtklf\" (UniqueName: \"kubernetes.io/projected/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d-kube-api-access-mtklf\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.070145 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.080509 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.091089 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:47 crc kubenswrapper[4830]: E0318 18:23:47.091493 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-metadata" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.091510 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-metadata" Mar 18 18:23:47 crc kubenswrapper[4830]: E0318 18:23:47.091556 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-log" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.091562 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-log" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.091753 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-log" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.091824 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" containerName="nova-metadata-metadata" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.092855 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.098469 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.098795 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.147909 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.210019 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.210099 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm9sr\" (UniqueName: \"kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.210148 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.210176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.210543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.312573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.312640 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm9sr\" (UniqueName: \"kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.312685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.312714 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " 
pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.312862 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.313544 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.318673 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.319251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.319730 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.349513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm9sr\" (UniqueName: 
\"kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr\") pod \"nova-metadata-0\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.421246 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.739018 4830 generic.go:334] "Generic (PLEG): container finished" podID="ff705200-15b1-471b-a5af-97566ce67516" containerID="8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" exitCode=0 Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.739384 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff705200-15b1-471b-a5af-97566ce67516","Type":"ContainerDied","Data":"8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83"} Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.778643 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.821726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmmn\" (UniqueName: \"kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn\") pod \"ff705200-15b1-471b-a5af-97566ce67516\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.821820 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data\") pod \"ff705200-15b1-471b-a5af-97566ce67516\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.821943 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle\") pod \"ff705200-15b1-471b-a5af-97566ce67516\" (UID: \"ff705200-15b1-471b-a5af-97566ce67516\") " Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.828366 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn" (OuterVolumeSpecName: "kube-api-access-vhmmn") pod "ff705200-15b1-471b-a5af-97566ce67516" (UID: "ff705200-15b1-471b-a5af-97566ce67516"). InnerVolumeSpecName "kube-api-access-vhmmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.851918 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff705200-15b1-471b-a5af-97566ce67516" (UID: "ff705200-15b1-471b-a5af-97566ce67516"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.857892 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data" (OuterVolumeSpecName: "config-data") pod "ff705200-15b1-471b-a5af-97566ce67516" (UID: "ff705200-15b1-471b-a5af-97566ce67516"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.924655 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmmn\" (UniqueName: \"kubernetes.io/projected/ff705200-15b1-471b-a5af-97566ce67516-kube-api-access-vhmmn\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.925196 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.925216 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff705200-15b1-471b-a5af-97566ce67516-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:47 crc kubenswrapper[4830]: I0318 18:23:47.992184 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.254159 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d" path="/var/lib/kubelet/pods/6dc9c9ee-00b4-49e0-a4cc-eedf2d89702d/volumes" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.752310 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.752323 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff705200-15b1-471b-a5af-97566ce67516","Type":"ContainerDied","Data":"0b3eb7284131a964ea82b013e3fd88b742c7bc0f6b9d679ac1eed4fd906617b3"} Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.752833 4830 scope.go:117] "RemoveContainer" containerID="8b88e3c3ace0f597cf06b04b680fd9ca20902806c14182d7fd471f34d5addb83" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.755700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerStarted","Data":"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6"} Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.755740 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerStarted","Data":"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364"} Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.755751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerStarted","Data":"9e11bb5f59234678f00d01cc0b778b5cdb4c6aa29fb289544c4f5a0c09d38e67"} Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.783995 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.783973258 podStartE2EDuration="1.783973258s" podCreationTimestamp="2026-03-18 18:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:48.776572159 +0000 UTC m=+1263.344202521" watchObservedRunningTime="2026-03-18 18:23:48.783973258 +0000 UTC m=+1263.351603600" 
Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.801650 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.812071 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.845953 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:48 crc kubenswrapper[4830]: E0318 18:23:48.846404 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff705200-15b1-471b-a5af-97566ce67516" containerName="nova-scheduler-scheduler" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.846422 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff705200-15b1-471b-a5af-97566ce67516" containerName="nova-scheduler-scheduler" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.846605 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff705200-15b1-471b-a5af-97566ce67516" containerName="nova-scheduler-scheduler" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.847263 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.852794 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.859327 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.940487 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.940572 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:48 crc kubenswrapper[4830]: I0318 18:23:48.940597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plzm\" (UniqueName: \"kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.041603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0" Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.041748 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.041852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plzm\" (UniqueName: \"kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.048532 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.052415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.063002 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plzm\" (UniqueName: \"kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm\") pod \"nova-scheduler-0\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") " pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.177136 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.505267 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.773255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1e8e20bd-67c1-48a7-be43-c585d65656ea","Type":"ContainerStarted","Data":"4cc141c38da7f2f14e8af81b886f2466b15b63a804233f4ae743bb0e785d7d90"}
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.773704 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1e8e20bd-67c1-48a7-be43-c585d65656ea","Type":"ContainerStarted","Data":"af056868fa1366cf5665b0b5558680ca7fbae4a4157d29750ff0672dfb35222e"}
Mar 18 18:23:49 crc kubenswrapper[4830]: I0318 18:23:49.803987 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.803962722 podStartE2EDuration="1.803962722s" podCreationTimestamp="2026-03-18 18:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:49.790754879 +0000 UTC m=+1264.358385261" watchObservedRunningTime="2026-03-18 18:23:49.803962722 +0000 UTC m=+1264.371593064"
Mar 18 18:23:50 crc kubenswrapper[4830]: I0318 18:23:50.255313 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff705200-15b1-471b-a5af-97566ce67516" path="/var/lib/kubelet/pods/ff705200-15b1-471b-a5af-97566ce67516/volumes"
Mar 18 18:23:54 crc kubenswrapper[4830]: I0318 18:23:54.073423 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:23:54 crc kubenswrapper[4830]: I0318 18:23:54.074190 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:23:54 crc kubenswrapper[4830]: I0318 18:23:54.177666 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 18:23:55 crc kubenswrapper[4830]: I0318 18:23:55.088079 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:55 crc kubenswrapper[4830]: I0318 18:23:55.088487 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:57 crc kubenswrapper[4830]: I0318 18:23:57.422281 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 18:23:57 crc kubenswrapper[4830]: I0318 18:23:57.422550 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 18:23:58 crc kubenswrapper[4830]: I0318 18:23:58.442965 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:58 crc kubenswrapper[4830]: I0318 18:23:58.443038 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:23:59 crc kubenswrapper[4830]: I0318 18:23:59.178092 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 18:23:59 crc kubenswrapper[4830]: I0318 18:23:59.211347 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 18:23:59 crc kubenswrapper[4830]: I0318 18:23:59.947305 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.149427 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564304-4mmp9"]
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.152250 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.154503 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.155427 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.156095 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.158079 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dpk\" (UniqueName: \"kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk\") pod \"auto-csr-approver-29564304-4mmp9\" (UID: \"86e46d34-ca52-4c41-88c3-376e6219e90f\") " pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.161378 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-4mmp9"]
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.219683 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.260571 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8dpk\" (UniqueName: \"kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk\") pod \"auto-csr-approver-29564304-4mmp9\" (UID: \"86e46d34-ca52-4c41-88c3-376e6219e90f\") " pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.286702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8dpk\" (UniqueName: \"kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk\") pod \"auto-csr-approver-29564304-4mmp9\" (UID: \"86e46d34-ca52-4c41-88c3-376e6219e90f\") " pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.478293 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:00 crc kubenswrapper[4830]: I0318 18:24:00.946033 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-4mmp9"]
Mar 18 18:24:01 crc kubenswrapper[4830]: I0318 18:24:01.929796 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-4mmp9" event={"ID":"86e46d34-ca52-4c41-88c3-376e6219e90f","Type":"ContainerStarted","Data":"02d3e554ac2ead0273cade38e259e3eb9b688b8587aecd1372c131c70712beb3"}
Mar 18 18:24:02 crc kubenswrapper[4830]: I0318 18:24:02.073383 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 18:24:02 crc kubenswrapper[4830]: I0318 18:24:02.073452 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 18:24:02 crc kubenswrapper[4830]: I0318 18:24:02.939898 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-4mmp9" event={"ID":"86e46d34-ca52-4c41-88c3-376e6219e90f","Type":"ContainerStarted","Data":"a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99"}
Mar 18 18:24:02 crc kubenswrapper[4830]: I0318 18:24:02.961715 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564304-4mmp9" podStartSLOduration=1.389751055 podStartE2EDuration="2.961696217s" podCreationTimestamp="2026-03-18 18:24:00 +0000 UTC" firstStartedPulling="2026-03-18 18:24:00.956668826 +0000 UTC m=+1275.524299188" lastFinishedPulling="2026-03-18 18:24:02.528614008 +0000 UTC m=+1277.096244350" observedRunningTime="2026-03-18 18:24:02.952160887 +0000 UTC m=+1277.519791219" watchObservedRunningTime="2026-03-18 18:24:02.961696217 +0000 UTC m=+1277.529326569"
Mar 18 18:24:03 crc kubenswrapper[4830]: I0318 18:24:03.958041 4830 generic.go:334] "Generic (PLEG): container finished" podID="86e46d34-ca52-4c41-88c3-376e6219e90f" containerID="a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99" exitCode=0
Mar 18 18:24:03 crc kubenswrapper[4830]: I0318 18:24:03.958127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-4mmp9" event={"ID":"86e46d34-ca52-4c41-88c3-376e6219e90f","Type":"ContainerDied","Data":"a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99"}
Mar 18 18:24:04 crc kubenswrapper[4830]: I0318 18:24:04.087014 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 18:24:04 crc kubenswrapper[4830]: I0318 18:24:04.091076 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 18:24:04 crc kubenswrapper[4830]: I0318 18:24:04.103296 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 18:24:04 crc kubenswrapper[4830]: E0318 18:24:04.164917 4830 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/d0a4cdbdf169bb9b005f84907324b2a16b6e19566ee4e0211290da96094049dc/diff" to get inode usage: stat /var/lib/containers/storage/overlay/d0a4cdbdf169bb9b005f84907324b2a16b6e19566ee4e0211290da96094049dc/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-scheduler-0_ff705200-15b1-471b-a5af-97566ce67516/nova-scheduler-scheduler/0.log" to get inode usage: stat /var/log/pods/openstack_nova-scheduler-0_ff705200-15b1-471b-a5af-97566ce67516/nova-scheduler-scheduler/0.log: no such file or directory
Mar 18 18:24:04 crc kubenswrapper[4830]: I0318 18:24:04.978850 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.354022 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.422120 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.422424 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.478296 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8dpk\" (UniqueName: \"kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk\") pod \"86e46d34-ca52-4c41-88c3-376e6219e90f\" (UID: \"86e46d34-ca52-4c41-88c3-376e6219e90f\") "
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.483913 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk" (OuterVolumeSpecName: "kube-api-access-n8dpk") pod "86e46d34-ca52-4c41-88c3-376e6219e90f" (UID: "86e46d34-ca52-4c41-88c3-376e6219e90f"). InnerVolumeSpecName "kube-api-access-n8dpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.581461 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8dpk\" (UniqueName: \"kubernetes.io/projected/86e46d34-ca52-4c41-88c3-376e6219e90f-kube-api-access-n8dpk\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:05 crc kubenswrapper[4830]: W0318 18:24:05.607673 4830 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86e46d34_ca52_4c41_88c3_376e6219e90f.slice/crio-conmon-a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86e46d34_ca52_4c41_88c3_376e6219e90f.slice/crio-conmon-a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99.scope: no such file or directory
Mar 18 18:24:05 crc kubenswrapper[4830]: W0318 18:24:05.607760 4830 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86e46d34_ca52_4c41_88c3_376e6219e90f.slice/crio-a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86e46d34_ca52_4c41_88c3_376e6219e90f.slice/crio-a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99.scope: no such file or directory
Mar 18 18:24:05 crc kubenswrapper[4830]: E0318 18:24:05.826240 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da95371_091a_4a62_b2c9_92ed39b8a65c.slice/crio-conmon-c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da95371_091a_4a62_b2c9_92ed39b8a65c.slice/crio-c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 18:24:05 crc kubenswrapper[4830]: I0318 18:24:05.920257 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.007867 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-4mmp9" event={"ID":"86e46d34-ca52-4c41-88c3-376e6219e90f","Type":"ContainerDied","Data":"02d3e554ac2ead0273cade38e259e3eb9b688b8587aecd1372c131c70712beb3"}
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.007906 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d3e554ac2ead0273cade38e259e3eb9b688b8587aecd1372c131c70712beb3"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.007937 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-4mmp9"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.013067 4830 generic.go:334] "Generic (PLEG): container finished" podID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerID="c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120" exitCode=137
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.013892 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.013917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerDied","Data":"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"}
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.013975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3da95371-091a-4a62-b2c9-92ed39b8a65c","Type":"ContainerDied","Data":"34b427733a2d0e3f7fb6d3ab5bb46acc389a9f9b8cb44be41e45ff54230e07f5"}
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.013999 4830 scope.go:117] "RemoveContainer" containerID="c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.042135 4830 scope.go:117] "RemoveContainer" containerID="e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.049844 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-2ll76"]
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.057568 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-2ll76"]
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.062563 4830 scope.go:117] "RemoveContainer" containerID="ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.082643 4830 scope.go:117] "RemoveContainer" containerID="5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093499 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093564 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093620 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093683 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093874 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.093952 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwv47\" (UniqueName: \"kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094071 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs\") pod \"3da95371-091a-4a62-b2c9-92ed39b8a65c\" (UID: \"3da95371-091a-4a62-b2c9-92ed39b8a65c\") "
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094426 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094551 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094649 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.094669 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3da95371-091a-4a62-b2c9-92ed39b8a65c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.099038 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47" (OuterVolumeSpecName: "kube-api-access-mwv47") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "kube-api-access-mwv47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.101594 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts" (OuterVolumeSpecName: "scripts") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.107071 4830 scope.go:117] "RemoveContainer" containerID="c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.107586 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120\": container with ID starting with c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120 not found: ID does not exist" containerID="c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.107623 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120"} err="failed to get container status \"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120\": rpc error: code = NotFound desc = could not find container \"c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120\": container with ID starting with c10a539833ccf4d417d3d1825ebcaa814fac29719635eaccd8995c9177eee120 not found: ID does not exist"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.107650 4830 scope.go:117] "RemoveContainer" containerID="e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.108040 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd\": container with ID starting with e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd not found: ID does not exist" containerID="e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.108088 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd"} err="failed to get container status \"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd\": rpc error: code = NotFound desc = could not find container \"e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd\": container with ID starting with e84225cc224d9f46d3586fa610485cf75c0075f86c01acbf7bc669015be12afd not found: ID does not exist"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.108114 4830 scope.go:117] "RemoveContainer" containerID="ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.108383 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a\": container with ID starting with ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a not found: ID does not exist" containerID="ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.108416 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a"} err="failed to get container status \"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a\": rpc error: code = NotFound desc = could not find container \"ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a\": container with ID starting with ca7c061ffd65d20aeedee0bcfba2a22fe5e5b5e41c7dfdde6be577142464368a not found: ID does not exist"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.108428 4830 scope.go:117] "RemoveContainer" containerID="5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.108756 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779\": container with ID starting with 5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779 not found: ID does not exist" containerID="5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.108797 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779"} err="failed to get container status \"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779\": rpc error: code = NotFound desc = could not find container \"5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779\": container with ID starting with 5448f2fb01d7197c5bd824c6050a90bcc7f9a161ebc18200f7991c27b0486779 not found: ID does not exist"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.135059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.146473 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.167420 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.184231 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data" (OuterVolumeSpecName: "config-data") pod "3da95371-091a-4a62-b2c9-92ed39b8a65c" (UID: "3da95371-091a-4a62-b2c9-92ed39b8a65c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196510 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196536 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196546 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196556 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196565 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwv47\" (UniqueName: \"kubernetes.io/projected/3da95371-091a-4a62-b2c9-92ed39b8a65c-kube-api-access-mwv47\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.196574 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da95371-091a-4a62-b2c9-92ed39b8a65c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.248338 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe310aca-eb6c-414a-ace8-e55bc2fd4133" path="/var/lib/kubelet/pods/fe310aca-eb6c-414a-ace8-e55bc2fd4133/volumes"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.347490 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.356554 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.381814 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.382806 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="sg-core"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.382822 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="sg-core"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.382856 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-notification-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.382862 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-notification-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.382877 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e46d34-ca52-4c41-88c3-376e6219e90f" containerName="oc"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.382885 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e46d34-ca52-4c41-88c3-376e6219e90f" containerName="oc"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.382893 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="proxy-httpd"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.382899 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="proxy-httpd"
Mar 18 18:24:06 crc kubenswrapper[4830]: E0318 18:24:06.382906 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-central-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.382912 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-central-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.383106 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-central-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.383125 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="sg-core"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.383137 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="ceilometer-notification-agent"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.383145 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" containerName="proxy-httpd"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.383161 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e46d34-ca52-4c41-88c3-376e6219e90f" containerName="oc"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.384708 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.391661 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.392383 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.392503 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.393429 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403441 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kh5\" (UniqueName: \"kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403700 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403740 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403915 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0"
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.403947 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs\") pod
\"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505502 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505561 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505718 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b4kh5\" (UniqueName: \"kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.505887 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.506175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.506239 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.506291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.509551 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" 
Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.510267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.510620 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.510867 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.517589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.520738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kh5\" (UniqueName: \"kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5\") pod \"ceilometer-0\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " pod="openstack/ceilometer-0" Mar 18 18:24:06 crc kubenswrapper[4830]: I0318 18:24:06.716070 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:07 crc kubenswrapper[4830]: I0318 18:24:07.218738 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:07 crc kubenswrapper[4830]: I0318 18:24:07.436503 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:24:07 crc kubenswrapper[4830]: I0318 18:24:07.437244 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:24:07 crc kubenswrapper[4830]: I0318 18:24:07.446181 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:24:08 crc kubenswrapper[4830]: I0318 18:24:08.040659 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerStarted","Data":"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3"} Mar 18 18:24:08 crc kubenswrapper[4830]: I0318 18:24:08.041083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerStarted","Data":"a63310a9928b07dd1360ac3b6c497432a25106199f7212391886bee0dcf8cbb8"} Mar 18 18:24:08 crc kubenswrapper[4830]: I0318 18:24:08.052589 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:24:08 crc kubenswrapper[4830]: I0318 18:24:08.256035 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da95371-091a-4a62-b2c9-92ed39b8a65c" path="/var/lib/kubelet/pods/3da95371-091a-4a62-b2c9-92ed39b8a65c/volumes" Mar 18 18:24:10 crc kubenswrapper[4830]: I0318 18:24:10.069743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerStarted","Data":"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec"} Mar 18 18:24:11 crc kubenswrapper[4830]: I0318 18:24:11.086659 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerStarted","Data":"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba"} Mar 18 18:24:12 crc kubenswrapper[4830]: I0318 18:24:12.398055 4830 scope.go:117] "RemoveContainer" containerID="0d178a10cd244dd97bedc863ccf064d241d53860aa33a14df01d0859301aff05" Mar 18 18:24:14 crc kubenswrapper[4830]: I0318 18:24:14.140080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerStarted","Data":"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9"} Mar 18 18:24:14 crc kubenswrapper[4830]: I0318 18:24:14.140852 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:24:14 crc kubenswrapper[4830]: I0318 18:24:14.178458 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.503300587 podStartE2EDuration="8.178438351s" podCreationTimestamp="2026-03-18 18:24:06 +0000 UTC" firstStartedPulling="2026-03-18 18:24:07.232050063 +0000 UTC m=+1281.799680425" lastFinishedPulling="2026-03-18 18:24:12.907187817 +0000 UTC m=+1287.474818189" observedRunningTime="2026-03-18 18:24:14.167313267 +0000 UTC m=+1288.734943599" watchObservedRunningTime="2026-03-18 18:24:14.178438351 +0000 UTC m=+1288.746068683" Mar 18 18:24:36 crc kubenswrapper[4830]: I0318 18:24:36.740208 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.226760 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.228722 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.243790 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.250899 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.376004 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6vv\" (UniqueName: \"kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.376077 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.429126 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z5m7v"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.451026 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z5m7v"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.479924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6vv\" (UniqueName: 
\"kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.480008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.480954 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.503846 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.504113 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="1965a180-09c8-4af1-852e-7792c02564ca" containerName="openstackclient" containerID="cri-o://ad0f8b84dc205164a749c73530020347ca97fd9f6445a06f2cc16f1876d40ecc" gracePeriod=2 Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.529020 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.530319 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.538311 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.569829 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.583467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6vv\" (UniqueName: \"kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv\") pod \"root-account-create-update-lhdqd\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.595248 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.633551 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"] Mar 18 18:24:57 crc kubenswrapper[4830]: E0318 18:24:57.634351 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1965a180-09c8-4af1-852e-7792c02564ca" containerName="openstackclient" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.634376 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1965a180-09c8-4af1-852e-7792c02564ca" containerName="openstackclient" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.634638 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1965a180-09c8-4af1-852e-7792c02564ca" containerName="openstackclient" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.635706 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.649830 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.658180 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.685414 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.685705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gjp4\" (UniqueName: \"kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.688615 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.689972 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.710359 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.750968 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790346 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790418 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gjp4\" (UniqueName: \"kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790557 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts\") pod 
\"placement-2062-account-create-update-92hq2\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790583 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7zpv\" (UniqueName: \"kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.790671 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn6h4\" (UniqueName: \"kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4\") pod \"placement-2062-account-create-update-92hq2\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.791067 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.817118 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.863440 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4043-account-create-update-d7kjb"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.874806 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4043-account-create-update-d7kjb"] Mar 18 
18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.875275 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lhdqd" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.878710 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gjp4\" (UniqueName: \"kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4\") pod \"barbican-4043-account-create-update-qpth4\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.879505 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.895589 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts\") pod \"placement-2062-account-create-update-92hq2\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.895629 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7zpv\" (UniqueName: \"kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.895717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn6h4\" (UniqueName: \"kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4\") pod \"placement-2062-account-create-update-92hq2\" (UID: 
\"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.895855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.899700 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts\") pod \"placement-2062-account-create-update-92hq2\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.900581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.919916 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.920486 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="cinder-scheduler" containerID="cri-o://b5a1cb9ea4b62aec9ff11f16f75f04dd21b28a1d37ed79fbc6fa3de1b8390289" gracePeriod=30 Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.920876 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="probe" containerID="cri-o://29d7529f10ab82210873010bcc63b7af8c9609591cd2c3e31e8d0689d2b017f6" gracePeriod=30 Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.977537 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7zpv\" (UniqueName: \"kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv\") pod \"neutron-b8f8-account-create-update-knfmq\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:57 crc kubenswrapper[4830]: I0318 18:24:57.982984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn6h4\" (UniqueName: \"kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4\") pod \"placement-2062-account-create-update-92hq2\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") " pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.015187 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2062-account-create-update-zb4gw"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.070455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.075723 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2062-account-create-update-zb4gw"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.142269 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.146227 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-lcc8p" Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.156861 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.157213 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api-log" containerID="cri-o://7f2292a71a9c798a2c17f9c6fc6b6d12fc68c263b920ce067f07390c0bc23f23" gracePeriod=30 Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.157399 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api" containerID="cri-o://29fc62aa8b0c7dff64144c93d1f53c7be2667c73d45b77f8b2e9fee0136dd279" gracePeriod=30 Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.170875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.181298 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.185888 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"] Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.187423 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.224614 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.243052 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.244115 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6llb\" (UniqueName: \"kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.245356 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.245696 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="ovn-northd" containerID="cri-o://5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" gracePeriod=30
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.245912 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="openstack-network-exporter" containerID="cri-o://1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78" gracePeriod=30
Mar 18 18:24:58 crc
kubenswrapper[4830]: I0318 18:24:58.265455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-92hq2"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.365749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6llb\" (UniqueName: \"kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.365962 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.366014 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2c4q\" (UniqueName: \"kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.366155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.368084 4830 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.448995 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6b8f3f-7b85-4504-b582-07edbfee2020" path="/var/lib/kubelet/pods/7e6b8f3f-7b85-4504-b582-07edbfee2020/volumes"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.470216 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be89bcb9-66b2-4bbd-bc78-be14e9503088" path="/var/lib/kubelet/pods/be89bcb9-66b2-4bbd-bc78-be14e9503088/volumes"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.470947 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5d6885-918e-49f2-8fdf-0098353bb996" path="/var/lib/kubelet/pods/ca5d6885-918e-49f2-8fdf-0098353bb996/volumes"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.471652 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.494704 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2c4q\" (UniqueName: \"kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.509064 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f8-account-create-update-vwb9c"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.509127 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.516676 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.529539 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6llb\" (UniqueName: \"kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb\") pod \"cinder-7f98-account-create-update-lcc8p\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.562866 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b8f8-account-create-update-vwb9c"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.588216 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jkvj9"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.588486 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-jkvj9" podUID="60094d0f-d530-424e-92d1-62e473acc664" containerName="openstack-network-exporter" containerID="cri-o://42b14e059955cc8e166c0627991a760521592d7af71c5890c3dca6e2c64b9fb8" gracePeriod=30
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.631526 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"]
Mar 18 18:24:58 crc
kubenswrapper[4830]: I0318 18:24:58.653724 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7f98-account-create-update-5lnx7"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.696527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2c4q\" (UniqueName: \"kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q\") pod \"nova-api-a2d6-account-create-update-ltf4b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.748176 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-chwf9"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.787832 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hxlrh"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.790494 4830 generic.go:334] "Generic (PLEG): container finished" podID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerID="7f2292a71a9c798a2c17f9c6fc6b6d12fc68c263b920ce067f07390c0bc23f23" exitCode=143
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.790624 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerDied","Data":"7f2292a71a9c798a2c17f9c6fc6b6d12fc68c263b920ce067f07390c0bc23f23"}
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.817810 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.823119 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.835008 4830 generic.go:334] "Generic (PLEG): container finished" podID="7b116575-f650-432e-9eb8-31b6f16b027c" containerID="1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78" exitCode=2
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.835486 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerDied","Data":"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78"}
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.858026 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.859549 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.870448 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.914081 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7f98-account-create-update-5lnx7"]
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.957171 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.957266 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86vlz\" (UniqueName:
\"kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:58 crc kubenswrapper[4830]: I0318 18:24:58.966588 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hxlrh"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.029862 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-8vhqn"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.031269 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.033719 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.049835 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.066495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.066617 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86vlz\" (UniqueName: \"kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:59
crc kubenswrapper[4830]: I0318 18:24:59.067867 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.076868 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-8vhqn"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.127842 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-qh85j"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.170593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.170942 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.171075 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-qh85j"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.193825 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-msrv5"]
Mar 18
18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.214246 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-msrv5"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.216147 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86vlz\" (UniqueName: \"kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz\") pod \"nova-cell0-03fa-account-create-update-fsvc6\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") " pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.326914 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-b4pjm"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.338996 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-b4pjm"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.369608 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wvvs5"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.400927 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.408626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.408958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.409124 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.409176 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:24:59.909159049 +0000 UTC m=+1334.476789381 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : configmap "openstack-cell1-scripts" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.435480 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wvvs5"]
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.446574 4830 projected.go:194] Error preparing data for projected volume kube-api-access-wcd8w for pod openstack/nova-cell1-88cd-account-create-update-8vhqn: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.447369 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:24:59.947349548 +0000 UTC m=+1334.514979880 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-wcd8w" (UniqueName: "kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.453798 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.460669 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.483860 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.573316 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.573385 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0"
podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="ovn-northd"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.587133 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.587565 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="openstack-network-exporter" containerID="cri-o://6d02c3d4022f8ff71336fe32eb97efefa0f42dad83cb62b31f62c9f071d62b10" gracePeriod=300
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.604040 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-qgxqj"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.611830 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.612279 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="openstack-network-exporter" containerID="cri-o://e6ff33896ab819ecb0f974d24f8341e4cd47187d5b83f4031d921d854055799e" gracePeriod=300
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.623528 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.623786 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="dnsmasq-dns" containerID="cri-o://1eb0db0b8dfbe3a3b14e7bb26b25f620aed32ce646e43dd05cbe50fab52b6163" gracePeriod=10
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.641513 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-qgxqj"]
Mar 18 18:24:59
crc kubenswrapper[4830]: E0318 18:24:59.657009 4830 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-chwf9" message=<
Mar 18 18:24:59 crc kubenswrapper[4830]: Exiting ovn-controller (1) [ OK ]
Mar 18 18:24:59 crc kubenswrapper[4830]: >
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.657051 4830 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-chwf9" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" containerID="cri-o://7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.657087 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-chwf9" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" containerID="cri-o://7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337" gracePeriod=30
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.663370 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vgb8c"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.685843 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vgb8c"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.697932 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nmp7q"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.745156 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-19b0-account-create-update-tm69m"]
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.746030 4830 configmap.go:193] Couldn't get configMap
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.746077 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data podName:56fb6c83-b748-4e21-9b1c-90fb37cefea1 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:00.24606296 +0000 UTC m=+1334.813693292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data") pod "rabbitmq-server-0" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1") : configmap "rabbitmq-config-data" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.748671 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="ovsdbserver-sb" containerID="cri-o://e27720e7dca97ec5784c549e6e6c7e84e6b4913613d159710e88f4288654e511" gracePeriod=300
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.798270 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="ovsdbserver-nb" containerID="cri-o://3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263" gracePeriod=300
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.845967 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nmp7q"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.921450 4830 generic.go:334] "Generic (PLEG): container finished" podID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerID="1eb0db0b8dfbe3a3b14e7bb26b25f620aed32ce646e43dd05cbe50fab52b6163" exitCode=0
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.921586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d"
event={"ID":"86ecee90-92ea-4ef1-a871-49018c2ac648","Type":"ContainerDied","Data":"1eb0db0b8dfbe3a3b14e7bb26b25f620aed32ce646e43dd05cbe50fab52b6163"}
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.930933 4830 generic.go:334] "Generic (PLEG): container finished" podID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerID="29d7529f10ab82210873010bcc63b7af8c9609591cd2c3e31e8d0689d2b017f6" exitCode=0
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.931124 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerDied","Data":"29d7529f10ab82210873010bcc63b7af8c9609591cd2c3e31e8d0689d2b017f6"}
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.939171 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-19b0-account-create-update-tm69m"]
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.950922 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93739148-39fb-4db3-ae9d-d222feb368d7/ovsdbserver-nb/0.log"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.950981 4830 generic.go:334] "Generic (PLEG): container finished" podID="93739148-39fb-4db3-ae9d-d222feb368d7" containerID="e6ff33896ab819ecb0f974d24f8341e4cd47187d5b83f4031d921d854055799e" exitCode=2
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.951080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerDied","Data":"e6ff33896ab819ecb0f974d24f8341e4cd47187d5b83f4031d921d854055799e"}
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.961015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID:
\"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.961090 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn"
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.961307 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.961359 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:00.961344334 +0000 UTC m=+1335.528974666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : configmap "openstack-cell1-scripts" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.983434 4830 projected.go:194] Error preparing data for projected volume kube-api-access-wcd8w for pod openstack/nova-cell1-88cd-account-create-update-8vhqn: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: E0318 18:24:59.983518 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed.
No retries permitted until 2026-03-18 18:25:00.98349586 +0000 UTC m=+1335.551126192 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wcd8w" (UniqueName: "kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.992919 4830 generic.go:334] "Generic (PLEG): container finished" podID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerID="7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337" exitCode=0
Mar 18 18:24:59 crc kubenswrapper[4830]: I0318 18:24:59.993011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9" event={"ID":"544c01f7-a6da-45de-96f2-9ab9dea0567c","Type":"ContainerDied","Data":"7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337"}
Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.005126 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-96knc"]
Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.023722 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jkvj9_60094d0f-d530-424e-92d1-62e473acc664/openstack-network-exporter/0.log"
Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.023797 4830 generic.go:334] "Generic (PLEG): container finished" podID="60094d0f-d530-424e-92d1-62e473acc664" containerID="42b14e059955cc8e166c0627991a760521592d7af71c5890c3dca6e2c64b9fb8" exitCode=2
Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.023883 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jkvj9" event={"ID":"60094d0f-d530-424e-92d1-62e473acc664","Type":"ContainerDied","Data":"42b14e059955cc8e166c0627991a760521592d7af71c5890c3dca6e2c64b9fb8"}
Mar 18 18:25:00 crc
kubenswrapper[4830]: I0318 18:25:00.040829 4830 generic.go:334] "Generic (PLEG): container finished" podID="1965a180-09c8-4af1-852e-7792c02564ca" containerID="ad0f8b84dc205164a749c73530020347ca97fd9f6445a06f2cc16f1876d40ecc" exitCode=137 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.060288 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4l8qp"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.074929 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4l8qp"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.094055 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-96knc"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.103556 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.103951 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-httpd" containerID="cri-o://41f23f0d4fef2bb42d4c0645e34a4042e362df833aa1814c1dd80e578b447069" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.103914 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-log" containerID="cri-o://ae17ba4052b5b73e7f8747e0bbd64f898ebbc5356b7377e5822b10903adec77d" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.152097 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153106 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-server" 
containerID="cri-o://7e87a03e3adb66017525596597b8739a2dd883902ed90c632e4e5cbfbfade6fe" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153238 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-updater" containerID="cri-o://19e2f77105d5703f0646d3c61e7fe7c902c627dbb91bbc626d9e5d5bb3fa485c" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153225 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-server" containerID="cri-o://3d815a588191bb1f303bdb826c5890be7d59362cd896066cfe0bd7ed228c7623" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153316 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-auditor" containerID="cri-o://68ad223077ac746b9802f4eba8764e5eaa00ca98bf3773872d2cd95daf9b38f0" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153349 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-replicator" containerID="cri-o://a19a7ebcd14a4be0ac0694088743b29c2a922f84d22e54d870830b7764d78682" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153381 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-server" containerID="cri-o://44d7f582b1786283b3e923d17f41dabde89bb1069ef6be7a6bc4c163e7c6d398" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153413 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-reaper" containerID="cri-o://1176cdff085a64af57931e22a9423ae76c0f52837134d47b63aa9518c32e92c6" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153473 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-auditor" containerID="cri-o://6c5bd47f9683cd9c5e03e6fd6c51407a8085923fe0e9f8ac3506a2c980271f44" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153524 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-replicator" containerID="cri-o://933487d6b7c0d60ba81cf11b01ceaae63489030bbb5fd50148a67d7724abf942" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153595 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-expirer" containerID="cri-o://93c102b5fc9f4a88a8768bdc36062b71725eccef648daf124aa807bb59ea8cd8" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153666 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="rsync" containerID="cri-o://ddac036c21cf6e7f7086be2d69ffc2a2c68d39299f23922898025a29a0596dc2" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153638 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-auditor" containerID="cri-o://36d3831532c5080f76c17b505df06b38c560192c7e4793abf714c8adf589ca70" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 
18:25:00.153646 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-updater" containerID="cri-o://a4919c4786f2548b6767558777a241dc56d419f9904e042566ee841adbe1f1e3" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153655 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-replicator" containerID="cri-o://b34b3a8f9ec6ede03cd125304c68ec8d92f19169893d3ffa48c8c3477adb2572" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.153622 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="swift-recon-cron" containerID="cri-o://a20dea92408e3316b920fe3e34c3564167b91f44ec33c56fd94553ec6a29e550" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.160290 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.160599 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-log" containerID="cri-o://0746b8bb66f5e7517a5d7f696d7212e472acad426b45aa47e1826fd52a0611e5" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.160806 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-httpd" containerID="cri-o://13a949ebe12567f356b288e72620234deec79f64d460b08c050f70b6131858f4" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.200842 4830 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.201276 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-676956db6-6grw2" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-log" containerID="cri-o://1f65787d2e3aac204498b2bda1b107a09472a1e7a4c737c2468ded43190d999e" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.201922 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-676956db6-6grw2" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-api" containerID="cri-o://06b5da3aa085e9b3e11d65936e872fab74b18aa97d39f5db82fc225e3ce954b4" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.229218 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.258007 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1057ef8c-09f6-4dc3-9350-bb834240d748" path="/var/lib/kubelet/pods/1057ef8c-09f6-4dc3-9350-bb834240d748/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.258675 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19dfbcf1-1e28-475a-ae01-092ae2e8764a" path="/var/lib/kubelet/pods/19dfbcf1-1e28-475a-ae01-092ae2e8764a/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.259489 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d348ed-98d2-494b-b2b1-f1dfb190a636" path="/var/lib/kubelet/pods/40d348ed-98d2-494b-b2b1-f1dfb190a636/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.260082 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523e43c6-ca4d-4107-bb08-02085e1fcd14" path="/var/lib/kubelet/pods/523e43c6-ca4d-4107-bb08-02085e1fcd14/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 
18:25:00.261734 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d328c0f-c9ac-4381-884a-44182b2544d7" path="/var/lib/kubelet/pods/7d328c0f-c9ac-4381-884a-44182b2544d7/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.264226 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db73a7a-33c0-4d36-9e96-39b5d68e5af8" path="/var/lib/kubelet/pods/7db73a7a-33c0-4d36-9e96-39b5d68e5af8/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.264944 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82570719-0e07-4ef2-adee-a287052cc4ac" path="/var/lib/kubelet/pods/82570719-0e07-4ef2-adee-a287052cc4ac/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.265570 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a1597fb-6c27-4b75-8996-40ff17a49e69" path="/var/lib/kubelet/pods/8a1597fb-6c27-4b75-8996-40ff17a49e69/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.266479 4830 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.266560 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data podName:56fb6c83-b748-4e21-9b1c-90fb37cefea1 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:01.266537468 +0000 UTC m=+1335.834167800 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data") pod "rabbitmq-server-0" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1") : configmap "rabbitmq-config-data" not found Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.267300 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c42d089-56c7-45ee-ba54-ee464499ff29" path="/var/lib/kubelet/pods/8c42d089-56c7-45ee-ba54-ee464499ff29/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.268215 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c80123-0588-4b50-a44b-18dca565e2ed" path="/var/lib/kubelet/pods/c5c80123-0588-4b50-a44b-18dca565e2ed/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.270252 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb262f8b-f0ed-4644-b313-2a2b46815860" path="/var/lib/kubelet/pods/cb262f8b-f0ed-4644-b313-2a2b46815860/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.284952 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0bd2ef-825f-4fad-8a4a-135941d72b5b" path="/var/lib/kubelet/pods/cf0bd2ef-825f-4fad-8a4a-135941d72b5b/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.285535 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cad194-a0a7-44e7-8e5c-4653ae33983c" path="/var/lib/kubelet/pods/e2cad194-a0a7-44e7-8e5c-4653ae33983c/volumes" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.296710 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fl66n"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.296751 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pnc5q"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.313936 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-fl66n"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.339939 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pnc5q"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.380552 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jkvj9_60094d0f-d530-424e-92d1-62e473acc664/openstack-network-exporter/0.log" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.380619 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.409176 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-49qmf"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.428358 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fpvks"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.439764 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fpvks"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.451851 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-49qmf"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.457978 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd" containerID="cri-o://880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" gracePeriod=29 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 
18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477109 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkfb5\" (UniqueName: \"kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477346 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477354 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477367 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.477486 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs\") pod \"60094d0f-d530-424e-92d1-62e473acc664\" (UID: \"60094d0f-d530-424e-92d1-62e473acc664\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 
18:25:00.478045 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.478465 4830 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.478745 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config" (OuterVolumeSpecName: "config") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.478820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.488224 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5" (OuterVolumeSpecName: "kube-api-access-kkfb5") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "kube-api-access-kkfb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.502440 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.508982 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.517233 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85cbc86c69-bkfst"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.517666 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85cbc86c69-bkfst" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-api" containerID="cri-o://22ea3fa0cc5c2b7b61047286d5c724a062a16f2a3599d4207776fd36457bdcd2" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.518021 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.518093 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85cbc86c69-bkfst" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-httpd" containerID="cri-o://2cdcb9ee439266520f74d448b0617ce7209026290de151d3b384a0c54cc23c3f" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.524694 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-q96zj"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.530613 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.543560 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.543836 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d76d78d97-bs4hd" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-httpd" containerID="cri-o://10ea1ae62f7573f638e31db710f3455f544b39c9e8f84f23270b74eeae48b588" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.543992 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d76d78d97-bs4hd" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-server" containerID="cri-o://c0416d3b3912bda28adfb32ff6910ca06aa3d2a68ff4208501b26467c7a964b5" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.577338 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.579817 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle\") pod \"1965a180-09c8-4af1-852e-7792c02564ca\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.579976 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config\") pod \"1965a180-09c8-4af1-852e-7792c02564ca\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580109 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret\") pod \"1965a180-09c8-4af1-852e-7792c02564ca\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580164 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc42f\" (UniqueName: \"kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f\") pod \"1965a180-09c8-4af1-852e-7792c02564ca\" (UID: \"1965a180-09c8-4af1-852e-7792c02564ca\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580657 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/60094d0f-d530-424e-92d1-62e473acc664-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580669 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580683 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60094d0f-d530-424e-92d1-62e473acc664-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.580692 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkfb5\" (UniqueName: \"kubernetes.io/projected/60094d0f-d530-424e-92d1-62e473acc664-kube-api-access-kkfb5\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.594557 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-q96zj"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.633048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f" (OuterVolumeSpecName: "kube-api-access-dc42f") pod "1965a180-09c8-4af1-852e-7792c02564ca" (UID: "1965a180-09c8-4af1-852e-7792c02564ca"). InnerVolumeSpecName "kube-api-access-dc42f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.647721 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.647948 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6bbb58d4c-74p8g" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker-log" containerID="cri-o://904ded3c9841d4d431c9a8b7917b3f2eec10c31241a56280fbcc48164d2a5323" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.649031 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6bbb58d4c-74p8g" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker" containerID="cri-o://e3cd2ffc35cea964dcec2e27b4b151f289beecdcd0e3b5f7b932d52f599b93c0" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.662472 4830 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.662727 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f884dc87d-6wvs2" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api-log" containerID="cri-o://b03c2437bc3b020985e30c6a140c3c922aeb2a95cd6d3bf3c72a87ffaf8ce7ba" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.662880 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f884dc87d-6wvs2" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api" containerID="cri-o://e20014a42907afd388ba14b211a6c05885fe859da4a4d5b322dfc735c19c8637" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.683265 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6h9l5"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.684369 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.684438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.684873 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: 
\"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.684935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.685012 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.685082 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twsxn\" (UniqueName: \"kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn\") pod \"86ecee90-92ea-4ef1-a871-49018c2ac648\" (UID: \"86ecee90-92ea-4ef1-a871-49018c2ac648\") " Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.685827 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc42f\" (UniqueName: \"kubernetes.io/projected/1965a180-09c8-4af1-852e-7792c02564ca-kube-api-access-dc42f\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.689965 4830 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 18:25:00 crc kubenswrapper[4830]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:25:00 crc kubenswrapper[4830]: + source /usr/local/bin/container-scripts/functions Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNBridge=br-int Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:25:00 crc kubenswrapper[4830]: ++ 
OVNEncapType=geneve Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNAvailabilityZones= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ EnableChassisAsGateway=true Mar 18 18:25:00 crc kubenswrapper[4830]: ++ PhysicalNetworks= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNHostName= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:25:00 crc kubenswrapper[4830]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:25:00 crc kubenswrapper[4830]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + cleanup_ovsdb_server_semaphore Mar 18 18:25:00 crc kubenswrapper[4830]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:25:00 crc kubenswrapper[4830]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-dv8kn" message=< Mar 18 18:25:00 crc kubenswrapper[4830]: Exiting ovsdb-server (5) [ OK ] Mar 18 18:25:00 crc kubenswrapper[4830]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:25:00 crc kubenswrapper[4830]: + source /usr/local/bin/container-scripts/functions Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNBridge=br-int Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNEncapType=geneve Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNAvailabilityZones= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ EnableChassisAsGateway=true Mar 18 18:25:00 crc kubenswrapper[4830]: ++ PhysicalNetworks= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNHostName= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:25:00 crc kubenswrapper[4830]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:25:00 crc kubenswrapper[4830]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + cleanup_ovsdb_server_semaphore Mar 18 18:25:00 crc kubenswrapper[4830]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:25:00 crc kubenswrapper[4830]: > Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.690047 4830 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 18:25:00 crc kubenswrapper[4830]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:25:00 crc kubenswrapper[4830]: + source /usr/local/bin/container-scripts/functions Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNBridge=br-int Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNEncapType=geneve Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNAvailabilityZones= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ EnableChassisAsGateway=true Mar 18 18:25:00 crc kubenswrapper[4830]: ++ PhysicalNetworks= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ OVNHostName= Mar 18 18:25:00 crc kubenswrapper[4830]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:25:00 crc kubenswrapper[4830]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:25:00 crc kubenswrapper[4830]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:25:00 crc kubenswrapper[4830]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + sleep 0.5 Mar 18 18:25:00 crc kubenswrapper[4830]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:25:00 crc kubenswrapper[4830]: + cleanup_ovsdb_server_semaphore Mar 18 18:25:00 crc kubenswrapper[4830]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:25:00 crc kubenswrapper[4830]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:25:00 crc kubenswrapper[4830]: > pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" containerID="cri-o://4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.690084 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" containerID="cri-o://4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" gracePeriod=28 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.704477 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6h9l5"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.704848 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn" (OuterVolumeSpecName: "kube-api-access-twsxn") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "kube-api-access-twsxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.728065 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.732551 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263 is running failed: container process not found" containerID="3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.733873 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263 is running failed: container process not found" containerID="3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.733954 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.734199 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener-log" containerID="cri-o://164f985d7fecf295460783b0211bbc6afa41549a232c6b8704e09de623fb3cd3" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.734560 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener" 
containerID="cri-o://fc53817ebacc0ce8c203daf49d972d55c5cc1843b058744c9a909e3088e8e2dc" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.734739 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263 is running failed: container process not found" containerID="3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.735937 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="ovsdbserver-nb" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.741145 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.741411 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-log" containerID="cri-o://7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.741904 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-metadata" containerID="cri-o://e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.791148 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twsxn\" 
(UniqueName: \"kubernetes.io/projected/86ecee90-92ea-4ef1-a871-49018c2ac648-kube-api-access-twsxn\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.812964 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.832872 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.835626 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-log" containerID="cri-o://7269557d2134b5328a9871c88e8307ed1155b8cea2686c2ae04cc355079f438f" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.836051 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-api" containerID="cri-o://91aff4166cbebec7917a849f1dae12a4f2caababfa680539bc75bf53f49cf551" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.853291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1965a180-09c8-4af1-852e-7792c02564ca" (UID: "1965a180-09c8-4af1-852e-7792c02564ca"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.871316 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:00 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: if [ -n "neutron" ]; then Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="neutron" Mar 18 18:25:00 crc kubenswrapper[4830]: else Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:00 crc kubenswrapper[4830]: fi Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:00 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:00 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:00 crc kubenswrapper[4830]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:00 crc kubenswrapper[4830]: # support updates Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.872028 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:00 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: if [ -n "barbican" ]; then Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="barbican" Mar 18 18:25:00 crc kubenswrapper[4830]: else Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:00 crc kubenswrapper[4830]: fi Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:00 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:00 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:00 crc kubenswrapper[4830]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:00 crc kubenswrapper[4830]: # support updates Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.874567 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-4043-account-create-update-qpth4" podUID="781fdccd-a9f3-40ce-9234-d651c079eb1e" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.874654 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-b8f8-account-create-update-knfmq" podUID="0df8dbfa-578e-4edf-ac2a-2030b582bc63" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.876933 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:00 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: if [ -n "" ]; then Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="" Mar 18 
18:25:00 crc kubenswrapper[4830]: else Mar 18 18:25:00 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:00 crc kubenswrapper[4830]: fi Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:00 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:00 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:00 crc kubenswrapper[4830]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:00 crc kubenswrapper[4830]: # support updates Mar 18 18:25:00 crc kubenswrapper[4830]: Mar 18 18:25:00 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:00 crc kubenswrapper[4830]: E0318 18:25:00.880151 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-lhdqd" podUID="e3a34a0e-8390-4618-8b6e-c27ed8adc51a" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.894800 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.897619 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="rabbitmq" containerID="cri-o://dae4cab83feb5262c8c7a5b8b0cb453b9f964431009385de80e3e0c21a526b8f" gracePeriod=604800 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.901592 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fhlvn"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 
18:25:00.916125 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1965a180-09c8-4af1-852e-7792c02564ca" (UID: "1965a180-09c8-4af1-852e-7792c02564ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.923406 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fhlvn"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.925037 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "60094d0f-d530-424e-92d1-62e473acc664" (UID: "60094d0f-d530-424e-92d1-62e473acc664"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.936016 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.945256 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4v25q"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.953304 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5vmth"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.961999 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4v25q"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.972992 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5vmth"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.973046 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.987045 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.987644 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1e8e20bd-67c1-48a7-be43-c585d65656ea" containerName="nova-scheduler-scheduler" containerID="cri-o://4cc141c38da7f2f14e8af81b886f2466b15b63a804233f4ae743bb0e785d7d90" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.996030 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.996263 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bab36094-736f-460a-83d1-bd298dee7774" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7" gracePeriod=30 Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.999818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:00 crc kubenswrapper[4830]: I0318 18:25:00.999951 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:01 crc 
kubenswrapper[4830]: I0318 18:25:01.000079 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/60094d0f-d530-424e-92d1-62e473acc664-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.000098 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.000109 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1965a180-09c8-4af1-852e-7792c02564ca" (UID: "1965a180-09c8-4af1-852e-7792c02564ca"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.000150 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.000188 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:03.00017513 +0000 UTC m=+1337.567805452 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.004696 4830 projected.go:194] Error preparing data for projected volume kube-api-access-wcd8w for pod openstack/nova-cell1-88cd-account-create-update-8vhqn: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.004800 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:03.00476584 +0000 UTC m=+1337.572396172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wcd8w" (UniqueName: "kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.006612 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-8vhqn"] Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.007370 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wcd8w operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" podUID="eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.014974 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.015667 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a0e71339-fd75-44b3-bbb8-15d75455d90f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b5f8b7f66219fddf66e22ef6b5a06dba84482b8f68cbbeea50a396ebe1d339d0" gracePeriod=30 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.015969 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.020598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.024745 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.028885 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.032941 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbk9z"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.042436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config" (OuterVolumeSpecName: "config") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.049860 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86ecee90-92ea-4ef1-a871-49018c2ac648" (UID: "86ecee90-92ea-4ef1-a871-49018c2ac648"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.058352 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.058644 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" gracePeriod=30 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.063703 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6d11dd9-4b5b-463e-a834-91c7ecc8b021/ovsdbserver-sb/0.log" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.063750 4830 generic.go:334] "Generic (PLEG): container finished" podID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerID="6d02c3d4022f8ff71336fe32eb97efefa0f42dad83cb62b31f62c9f071d62b10" exitCode=2 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.063788 4830 generic.go:334] "Generic (PLEG): container finished" podID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerID="e27720e7dca97ec5784c549e6e6c7e84e6b4913613d159710e88f4288654e511" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.063835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerDied","Data":"6d02c3d4022f8ff71336fe32eb97efefa0f42dad83cb62b31f62c9f071d62b10"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.063863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerDied","Data":"e27720e7dca97ec5784c549e6e6c7e84e6b4913613d159710e88f4288654e511"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.067865 4830 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p8g2m"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070115 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="ddac036c21cf6e7f7086be2d69ffc2a2c68d39299f23922898025a29a0596dc2" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070142 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="93c102b5fc9f4a88a8768bdc36062b71725eccef648daf124aa807bb59ea8cd8" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070152 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="a4919c4786f2548b6767558777a241dc56d419f9904e042566ee841adbe1f1e3" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070161 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="36d3831532c5080f76c17b505df06b38c560192c7e4793abf714c8adf589ca70" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070169 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="b34b3a8f9ec6ede03cd125304c68ec8d92f19169893d3ffa48c8c3477adb2572" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070177 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="3d815a588191bb1f303bdb826c5890be7d59362cd896066cfe0bd7ed228c7623" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070186 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="19e2f77105d5703f0646d3c61e7fe7c902c627dbb91bbc626d9e5d5bb3fa485c" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070194 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="68ad223077ac746b9802f4eba8764e5eaa00ca98bf3773872d2cd95daf9b38f0" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070218 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="a19a7ebcd14a4be0ac0694088743b29c2a922f84d22e54d870830b7764d78682" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070227 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="44d7f582b1786283b3e923d17f41dabde89bb1069ef6be7a6bc4c163e7c6d398" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070234 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="1176cdff085a64af57931e22a9423ae76c0f52837134d47b63aa9518c32e92c6" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070240 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="6c5bd47f9683cd9c5e03e6fd6c51407a8085923fe0e9f8ac3506a2c980271f44" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070247 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="933487d6b7c0d60ba81cf11b01ceaae63489030bbb5fd50148a67d7724abf942" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070254 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="7e87a03e3adb66017525596597b8739a2dd883902ed90c632e4e5cbfbfade6fe" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"ddac036c21cf6e7f7086be2d69ffc2a2c68d39299f23922898025a29a0596dc2"} Mar 18 18:25:01 crc 
kubenswrapper[4830]: I0318 18:25:01.070305 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"93c102b5fc9f4a88a8768bdc36062b71725eccef648daf124aa807bb59ea8cd8"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070316 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"a4919c4786f2548b6767558777a241dc56d419f9904e042566ee841adbe1f1e3"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070325 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"36d3831532c5080f76c17b505df06b38c560192c7e4793abf714c8adf589ca70"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070333 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"b34b3a8f9ec6ede03cd125304c68ec8d92f19169893d3ffa48c8c3477adb2572"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070342 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"3d815a588191bb1f303bdb826c5890be7d59362cd896066cfe0bd7ed228c7623"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070350 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"19e2f77105d5703f0646d3c61e7fe7c902c627dbb91bbc626d9e5d5bb3fa485c"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070358 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"68ad223077ac746b9802f4eba8764e5eaa00ca98bf3773872d2cd95daf9b38f0"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070366 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"a19a7ebcd14a4be0ac0694088743b29c2a922f84d22e54d870830b7764d78682"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070375 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"44d7f582b1786283b3e923d17f41dabde89bb1069ef6be7a6bc4c163e7c6d398"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070382 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"1176cdff085a64af57931e22a9423ae76c0f52837134d47b63aa9518c32e92c6"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070391 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"6c5bd47f9683cd9c5e03e6fd6c51407a8085923fe0e9f8ac3506a2c980271f44"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070399 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"933487d6b7c0d60ba81cf11b01ceaae63489030bbb5fd50148a67d7724abf942"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.070407 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"7e87a03e3adb66017525596597b8739a2dd883902ed90c632e4e5cbfbfade6fe"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 
18:25:01.071860 4830 generic.go:334] "Generic (PLEG): container finished" podID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerID="0746b8bb66f5e7517a5d7f696d7212e472acad426b45aa47e1826fd52a0611e5" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.071899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerDied","Data":"0746b8bb66f5e7517a5d7f696d7212e472acad426b45aa47e1826fd52a0611e5"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.075170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-chwf9" event={"ID":"544c01f7-a6da-45de-96f2-9ab9dea0567c","Type":"ContainerDied","Data":"d3c3e314884865817c6c30dd92c28997f9b3bd149baf807074125a27d2981e24"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.075207 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c3e314884865817c6c30dd92c28997f9b3bd149baf807074125a27d2981e24" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.078654 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbk9z"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.086264 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p8g2m"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.092166 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.093114 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jkvj9_60094d0f-d530-424e-92d1-62e473acc664/openstack-network-exporter/0.log" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.093173 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jkvj9" 
event={"ID":"60094d0f-d530-424e-92d1-62e473acc664","Type":"ContainerDied","Data":"250dd4e24f2c79323606d92cdadeed2e6b85ef56d96df84143ca1b4bbede7635"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.093202 4830 scope.go:117] "RemoveContainer" containerID="42b14e059955cc8e166c0627991a760521592d7af71c5890c3dca6e2c64b9fb8" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.093300 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jkvj9" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102746 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1965a180-09c8-4af1-852e-7792c02564ca-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102788 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102799 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102808 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102817 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.102825 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/86ecee90-92ea-4ef1-a871-49018c2ac648-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.118872 4830 generic.go:334] "Generic (PLEG): container finished" podID="3e152864-9096-47a7-b0b0-c288840093e7" containerID="b03c2437bc3b020985e30c6a140c3c922aeb2a95cd6d3bf3c72a87ffaf8ce7ba" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.118916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerDied","Data":"b03c2437bc3b020985e30c6a140c3c922aeb2a95cd6d3bf3c72a87ffaf8ce7ba"} Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.119615 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:01 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: if [ -n "placement" ]; then Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="placement" Mar 18 18:25:01 crc kubenswrapper[4830]: else Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:01 crc kubenswrapper[4830]: fi Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:01 crc 
kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:01 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:01 crc kubenswrapper[4830]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:01 crc kubenswrapper[4830]: # support updates Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.130046 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.130096 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"] Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.130363 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-2062-account-create-update-92hq2" podUID="b1e4500f-681e-433d-8283-008eec618721" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.133919 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f8-account-create-update-knfmq" event={"ID":"0df8dbfa-578e-4edf-ac2a-2030b582bc63","Type":"ContainerStarted","Data":"7ed51f6ee3c0f81f2ecb70d99000e5203727aa839556140bf48a3b994c0d3bdf"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.136845 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.140832 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3ba738f-c556-4b36-a045-3516efdf886a" containerID="7269557d2134b5328a9871c88e8307ed1155b8cea2686c2ae04cc355079f438f" exitCode=143 Mar 18 18:25:01 crc 
kubenswrapper[4830]: I0318 18:25:01.140915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerDied","Data":"7269557d2134b5328a9871c88e8307ed1155b8cea2686c2ae04cc355079f438f"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.143101 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.145987 4830 generic.go:334] "Generic (PLEG): container finished" podID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerID="164f985d7fecf295460783b0211bbc6afa41549a232c6b8704e09de623fb3cd3" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.146047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerDied","Data":"164f985d7fecf295460783b0211bbc6afa41549a232c6b8704e09de623fb3cd3"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.154668 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lhdqd" event={"ID":"e3a34a0e-8390-4618-8b6e-c27ed8adc51a","Type":"ContainerStarted","Data":"62a2356e29e06291d1908bb5af963b79ec3b661964253de42b333344b86545b3"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.155191 4830 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-lhdqd" secret="" err="secret \"galera-openstack-cell1-dockercfg-c6d24\" not found" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.157311 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="rabbitmq" containerID="cri-o://27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d" gracePeriod=604800 Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.157548 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:01 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: if [ -n "" ]; then Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="" Mar 18 18:25:01 crc kubenswrapper[4830]: else Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:01 crc kubenswrapper[4830]: fi Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:01 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:01 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:01 crc kubenswrapper[4830]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:01 crc kubenswrapper[4830]: # support updates Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.158640 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-lhdqd" podUID="e3a34a0e-8390-4618-8b6e-c27ed8adc51a" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.173090 4830 generic.go:334] "Generic (PLEG): container finished" podID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.173191 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerDied","Data":"4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.215027 4830 generic.go:334] "Generic (PLEG): container finished" podID="ad760963-34af-440e-9931-fbc23783d7cb" containerID="1f65787d2e3aac204498b2bda1b107a09472a1e7a4c737c2468ded43190d999e" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.215148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerDied","Data":"1f65787d2e3aac204498b2bda1b107a09472a1e7a4c737c2468ded43190d999e"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.255331 4830 generic.go:334] "Generic (PLEG): container finished" podID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" 
containerID="b5a1cb9ea4b62aec9ff11f16f75f04dd21b28a1d37ed79fbc6fa3de1b8390289" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.255694 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerDied","Data":"b5a1cb9ea4b62aec9ff11f16f75f04dd21b28a1d37ed79fbc6fa3de1b8390289"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.282556 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93739148-39fb-4db3-ae9d-d222feb368d7/ovsdbserver-nb/0.log" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.282613 4830 generic.go:334] "Generic (PLEG): container finished" podID="93739148-39fb-4db3-ae9d-d222feb368d7" containerID="3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.283071 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerDied","Data":"3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.283115 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93739148-39fb-4db3-ae9d-d222feb368d7","Type":"ContainerDied","Data":"c368f83eb29fcf234da438ba1f29502958e03086b3711a354500f4c7b5c5c05a"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.283344 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c368f83eb29fcf234da438ba1f29502958e03086b3711a354500f4c7b5c5c05a" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.296535 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4043-account-create-update-qpth4" 
event={"ID":"781fdccd-a9f3-40ce-9234-d651c079eb1e","Type":"ContainerStarted","Data":"b501e9203036a4afb723c9dc655a2c3127365dbd4dff4770f3f8216154935e6b"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.333273 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.365436 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.365508 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts podName:e3a34a0e-8390-4618-8b6e-c27ed8adc51a nodeName:}" failed. No retries permitted until 2026-03-18 18:25:01.865492534 +0000 UTC m=+1336.433122866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts") pod "root-account-create-update-lhdqd" (UID: "e3a34a0e-8390-4618-8b6e-c27ed8adc51a") : configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.366341 4830 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.366381 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data podName:56fb6c83-b748-4e21-9b1c-90fb37cefea1 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:03.366357538 +0000 UTC m=+1337.933987870 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data") pod "rabbitmq-server-0" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1") : configmap "rabbitmq-config-data" not found Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.421743 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.428430 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.442928 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" event={"ID":"86ecee90-92ea-4ef1-a871-49018c2ac648","Type":"ContainerDied","Data":"dbaa861ff7070c74290ea330d4c08aa1519ccc9ef2b2603c86dae354a5f498fa"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.443300 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93739148-39fb-4db3-ae9d-d222feb368d7/ovsdbserver-nb/0.log" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.443376 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.443455 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-7ck7d" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.454180 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6d11dd9-4b5b-463e-a834-91c7ecc8b021/ovsdbserver-sb/0.log" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.454264 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.481016 4830 generic.go:334] "Generic (PLEG): container finished" podID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerID="7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.481102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerDied","Data":"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.497300 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jkvj9"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.501112 4830 scope.go:117] "RemoveContainer" containerID="ad0f8b84dc205164a749c73530020347ca97fd9f6445a06f2cc16f1876d40ecc" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.501108 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="galera" containerID="cri-o://01d8e91004d318c41a6579e547dd6425e1913b522dba6cd78012d1eca9d7aedf" gracePeriod=30 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.502686 4830 generic.go:334] "Generic (PLEG): container finished" podID="11e19037-abf1-4269-b933-0950913973b9" containerID="904ded3c9841d4d431c9a8b7917b3f2eec10c31241a56280fbcc48164d2a5323" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.502752 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerDied","Data":"904ded3c9841d4d431c9a8b7917b3f2eec10c31241a56280fbcc48164d2a5323"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.505372 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-metrics-jkvj9"] Mar 18 18:25:01 crc kubenswrapper[4830]: W0318 18:25:01.509600 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4120b308_df6b_45df_ab90_abc5417228e5.slice/crio-e34b06348c1e98c466d929d0330c19f90981e845378263b31797f130d7616067 WatchSource:0}: Error finding container e34b06348c1e98c466d929d0330c19f90981e845378263b31797f130d7616067: Status 404 returned error can't find the container with id e34b06348c1e98c466d929d0330c19f90981e845378263b31797f130d7616067 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.517851 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.525602 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"] Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.533543 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:01 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: if [ -n "nova_api" ]; then Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="nova_api" Mar 18 18:25:01 crc kubenswrapper[4830]: else Mar 18 
18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:01 crc kubenswrapper[4830]: fi Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:01 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:01 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:01 crc kubenswrapper[4830]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:01 crc kubenswrapper[4830]: # support updates Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.534704 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-a2d6-account-create-update-ltf4b" podUID="a3421452-ceb9-441f-8982-77c0a33c7a3b" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.538201 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerID="ae17ba4052b5b73e7f8747e0bbd64f898ebbc5356b7377e5822b10903adec77d" exitCode=143 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.538270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerDied","Data":"ae17ba4052b5b73e7f8747e0bbd64f898ebbc5356b7377e5822b10903adec77d"} Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.539658 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:01 crc kubenswrapper[4830]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: if [ -n "nova_cell0" ]; then Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="nova_cell0" Mar 18 18:25:01 crc kubenswrapper[4830]: else Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:01 crc kubenswrapper[4830]: fi Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:01 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:01 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:01 crc kubenswrapper[4830]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:01 crc kubenswrapper[4830]: # support updates Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.539713 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:25:01 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: if [ -n "cinder" ]; then Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="cinder" Mar 18 18:25:01 crc kubenswrapper[4830]: else Mar 18 18:25:01 crc kubenswrapper[4830]: GRANT_DATABASE="*" Mar 18 18:25:01 crc kubenswrapper[4830]: fi Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: # going for maximum compatibility here: Mar 18 18:25:01 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:25:01 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:25:01 crc kubenswrapper[4830]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:25:01 crc kubenswrapper[4830]: # support updates Mar 18 18:25:01 crc kubenswrapper[4830]: Mar 18 18:25:01 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.545487 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-7f98-account-create-update-lcc8p" podUID="4120b308-df6b-45df-ab90-abc5417228e5" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.546192 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6" podUID="a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.552329 4830 generic.go:334] "Generic (PLEG): container finished" podID="9be76a38-b85f-458f-b5c9-181abf962109" containerID="c0416d3b3912bda28adfb32ff6910ca06aa3d2a68ff4208501b26467c7a964b5" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.552360 4830 generic.go:334] "Generic (PLEG): container finished" podID="9be76a38-b85f-458f-b5c9-181abf962109" containerID="10ea1ae62f7573f638e31db710f3455f544b39c9e8f84f23270b74eeae48b588" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.552419 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerDied","Data":"c0416d3b3912bda28adfb32ff6910ca06aa3d2a68ff4208501b26467c7a964b5"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.552445 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" 
event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerDied","Data":"10ea1ae62f7573f638e31db710f3455f544b39c9e8f84f23270b74eeae48b588"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.558965 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.560973 4830 generic.go:334] "Generic (PLEG): container finished" podID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerID="2cdcb9ee439266520f74d448b0617ce7209026290de151d3b384a0c54cc23c3f" exitCode=0 Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.561049 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.561917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerDied","Data":"2cdcb9ee439266520f74d448b0617ce7209026290de151d3b384a0c54cc23c3f"} Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.568862 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.568904 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.568932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.568975 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569000 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569021 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kr6m\" (UniqueName: \"kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569101 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc 
kubenswrapper[4830]: I0318 18:25:01.569149 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569185 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569213 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569234 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569267 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569292 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle\") pod 
\"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569320 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569352 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569370 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569396 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98gw\" (UniqueName: \"kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569413 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569444 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-l4v24\" (UniqueName: \"kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24\") pod \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\" (UID: \"e6d11dd9-4b5b-463e-a834-91c7ecc8b021\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569484 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs\") pod \"93739148-39fb-4db3-ae9d-d222feb368d7\" (UID: \"93739148-39fb-4db3-ae9d-d222feb368d7\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.569501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run\") pod \"544c01f7-a6da-45de-96f2-9ab9dea0567c\" (UID: \"544c01f7-a6da-45de-96f2-9ab9dea0567c\") " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.570188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run" (OuterVolumeSpecName: "var-run") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.572385 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.575592 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.576058 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config" (OuterVolumeSpecName: "config") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.581706 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.582656 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config" (OuterVolumeSpecName: "config") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.583083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts" (OuterVolumeSpecName: "scripts") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.583362 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-7ck7d"] Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.584208 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts" (OuterVolumeSpecName: "scripts") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.584267 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.585317 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts" (OuterVolumeSpecName: "scripts") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.588303 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.598016 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw" (OuterVolumeSpecName: "kube-api-access-s98gw") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "kube-api-access-s98gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.619926 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.620646 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.644345 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m" (OuterVolumeSpecName: "kube-api-access-8kr6m") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "kube-api-access-8kr6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.648572 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24" (OuterVolumeSpecName: "kube-api-access-l4v24") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "kube-api-access-l4v24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.651256 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.662140 4830 scope.go:117] "RemoveContainer" containerID="1eb0db0b8dfbe3a3b14e7bb26b25f620aed32ce646e43dd05cbe50fab52b6163" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672327 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672585 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672595 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672605 4830 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672614 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672622 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672630 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kr6m\" (UniqueName: 
\"kubernetes.io/projected/93739148-39fb-4db3-ae9d-d222feb368d7-kube-api-access-8kr6m\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672639 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672648 4830 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/544c01f7-a6da-45de-96f2-9ab9dea0567c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672657 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672669 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/544c01f7-a6da-45de-96f2-9ab9dea0567c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672677 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672700 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672709 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93739148-39fb-4db3-ae9d-d222feb368d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672718 4830 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-s98gw\" (UniqueName: \"kubernetes.io/projected/544c01f7-a6da-45de-96f2-9ab9dea0567c-kube-api-access-s98gw\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.672729 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4v24\" (UniqueName: \"kubernetes.io/projected/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-kube-api-access-l4v24\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.727365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.738694 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.750235 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.774180 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.774208 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.774218 4830 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.822303 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.825386 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.876731 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.876851 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.876935 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: E0318 18:25:01.877007 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts podName:e3a34a0e-8390-4618-8b6e-c27ed8adc51a nodeName:}" failed. No retries permitted until 2026-03-18 18:25:02.876991788 +0000 UTC m=+1337.444622120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts") pod "root-account-create-update-lhdqd" (UID: "e3a34a0e-8390-4618-8b6e-c27ed8adc51a") : configmap "openstack-cell1-scripts" not found Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.900361 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e6d11dd9-4b5b-463e-a834-91c7ecc8b021" (UID: "e6d11dd9-4b5b-463e-a834-91c7ecc8b021"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.944783 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "544c01f7-a6da-45de-96f2-9ab9dea0567c" (UID: "544c01f7-a6da-45de-96f2-9ab9dea0567c"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.982105 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d11dd9-4b5b-463e-a834-91c7ecc8b021-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.982147 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c01f7-a6da-45de-96f2-9ab9dea0567c-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.983517 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:01 crc kubenswrapper[4830]: I0318 18:25:01.999585 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "93739148-39fb-4db3-ae9d-d222feb368d7" (UID: "93739148-39fb-4db3-ae9d-d222feb368d7"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.006208 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.034400 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.043789 4830 scope.go:117] "RemoveContainer" containerID="5877e480f98ac8df8a3e7169161ee11df8c8ec531baa72e278500934c0be1c69" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.066436 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.071306 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.084948 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle\") pod \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085167 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id\") pod \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom\") pod 
\"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085207 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts\") pod \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085229 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data\") pod \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085278 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2z5q\" (UniqueName: \"kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q\") pod \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\" (UID: \"f82e5b2f-cb79-4b83-901f-eca64116c6dc\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085682 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.085693 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93739148-39fb-4db3-ae9d-d222feb368d7-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.086514 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.090310 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q" (OuterVolumeSpecName: "kube-api-access-g2z5q") pod "f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "kube-api-access-g2z5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.090521 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.092320 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts" (OuterVolumeSpecName: "scripts") pod "f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.148553 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.187872 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts\") pod \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188305 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts\") pod \"781fdccd-a9f3-40ce-9234-d651c079eb1e\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188333 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188359 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0df8dbfa-578e-4edf-ac2a-2030b582bc63" (UID: "0df8dbfa-578e-4edf-ac2a-2030b582bc63"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188398 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188452 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gjp4\" (UniqueName: \"kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4\") pod \"781fdccd-a9f3-40ce-9234-d651c079eb1e\" (UID: \"781fdccd-a9f3-40ce-9234-d651c079eb1e\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188513 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188613 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7zpv\" (UniqueName: \"kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv\") pod \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\" (UID: \"0df8dbfa-578e-4edf-ac2a-2030b582bc63\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188662 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fk6s\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.188680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd\") pod \"9be76a38-b85f-458f-b5c9-181abf962109\" (UID: \"9be76a38-b85f-458f-b5c9-181abf962109\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189161 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82e5b2f-cb79-4b83-901f-eca64116c6dc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189178 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189188 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189197 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df8dbfa-578e-4edf-ac2a-2030b582bc63-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189206 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2z5q\" (UniqueName: \"kubernetes.io/projected/f82e5b2f-cb79-4b83-901f-eca64116c6dc-kube-api-access-g2z5q\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189216 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.189984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.190633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "781fdccd-a9f3-40ce-9234-d651c079eb1e" (UID: "781fdccd-a9f3-40ce-9234-d651c079eb1e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.195224 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv" (OuterVolumeSpecName: "kube-api-access-w7zpv") pod "0df8dbfa-578e-4edf-ac2a-2030b582bc63" (UID: "0df8dbfa-578e-4edf-ac2a-2030b582bc63"). InnerVolumeSpecName "kube-api-access-w7zpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.195258 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s" (OuterVolumeSpecName: "kube-api-access-6fk6s") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "kube-api-access-6fk6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.200274 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.201101 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4" (OuterVolumeSpecName: "kube-api-access-6gjp4") pod "781fdccd-a9f3-40ce-9234-d651c079eb1e" (UID: "781fdccd-a9f3-40ce-9234-d651c079eb1e"). InnerVolumeSpecName "kube-api-access-6gjp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.209964 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.235446 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data" (OuterVolumeSpecName: "config-data") pod "f82e5b2f-cb79-4b83-901f-eca64116c6dc" (UID: "f82e5b2f-cb79-4b83-901f-eca64116c6dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.251823 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3aa2e8-fa67-406b-b8cd-e21725c059c3" path="/var/lib/kubelet/pods/0b3aa2e8-fa67-406b-b8cd-e21725c059c3/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.255179 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c12321e-7436-4126-9ad3-597fa7216bc8" path="/var/lib/kubelet/pods/0c12321e-7436-4126-9ad3-597fa7216bc8/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.256696 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1965a180-09c8-4af1-852e-7792c02564ca" path="/var/lib/kubelet/pods/1965a180-09c8-4af1-852e-7792c02564ca/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.258082 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381d2049-85b1-49f5-a548-c3f5449fee4d" path="/var/lib/kubelet/pods/381d2049-85b1-49f5-a548-c3f5449fee4d/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.258943 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b25fae-fe31-4c24-b22c-9a459c4ecebc" path="/var/lib/kubelet/pods/59b25fae-fe31-4c24-b22c-9a459c4ecebc/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.259458 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0852c0-51a3-4de2-9f84-e1c7042f4f13" path="/var/lib/kubelet/pods/5e0852c0-51a3-4de2-9f84-e1c7042f4f13/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.260188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.260640 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60094d0f-d530-424e-92d1-62e473acc664" path="/var/lib/kubelet/pods/60094d0f-d530-424e-92d1-62e473acc664/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.261628 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74526bf3-152f-40de-9923-9197ffddfc2d" path="/var/lib/kubelet/pods/74526bf3-152f-40de-9923-9197ffddfc2d/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.262142 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" path="/var/lib/kubelet/pods/86ecee90-92ea-4ef1-a871-49018c2ac648/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.263375 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95625855-e07d-4366-8b59-7bc241752fab" path="/var/lib/kubelet/pods/95625855-e07d-4366-8b59-7bc241752fab/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.264087 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="9ae038dc-03d7-407c-81eb-ae1bae65d555" path="/var/lib/kubelet/pods/9ae038dc-03d7-407c-81eb-ae1bae65d555/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.265108 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca42574-9bdb-4ef3-bd58-3973e9144285" path="/var/lib/kubelet/pods/9ca42574-9bdb-4ef3-bd58-3973e9144285/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.266020 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61dfaea-fa74-44a5-b2a2-d6b7232008f9" path="/var/lib/kubelet/pods/b61dfaea-fa74-44a5-b2a2-d6b7232008f9/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.267109 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a030d7-4344-449d-8edb-805be7b5604f" path="/var/lib/kubelet/pods/e0a030d7-4344-449d-8edb-805be7b5604f/volumes" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.288976 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290468 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290589 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290643 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7zpv\" (UniqueName: \"kubernetes.io/projected/0df8dbfa-578e-4edf-ac2a-2030b582bc63-kube-api-access-w7zpv\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290697 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fk6s\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-kube-api-access-6fk6s\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290748 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9be76a38-b85f-458f-b5c9-181abf962109-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290832 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82e5b2f-cb79-4b83-901f-eca64116c6dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290916 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781fdccd-a9f3-40ce-9234-d651c079eb1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.290970 
4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.291020 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9be76a38-b85f-458f-b5c9-181abf962109-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.291070 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gjp4\" (UniqueName: \"kubernetes.io/projected/781fdccd-a9f3-40ce-9234-d651c079eb1e-kube-api-access-6gjp4\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.307583 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.328468 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-clvpz"] Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.328944 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="ovsdbserver-nb" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.328957 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="ovsdbserver-nb" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.328971 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.328978 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.328998 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329004 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329011 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-httpd" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329017 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-httpd" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329027 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" 
containerName="probe" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329033 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="probe" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329044 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="dnsmasq-dns" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329050 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="dnsmasq-dns" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329065 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="cinder-scheduler" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329072 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="cinder-scheduler" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329080 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60094d0f-d530-424e-92d1-62e473acc664" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329085 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="60094d0f-d530-424e-92d1-62e473acc664" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329096 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-server" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329101 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-server" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329109 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" 
containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329114 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329127 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="init" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329132 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="init" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.329141 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="ovsdbserver-sb" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329147 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="ovsdbserver-sb" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329301 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="ovsdbserver-nb" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329314 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329327 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-httpd" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329337 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" containerName="ovn-controller" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329348 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="60094d0f-d530-424e-92d1-62e473acc664" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329359 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be76a38-b85f-458f-b5c9-181abf962109" containerName="proxy-server" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329371 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="openstack-network-exporter" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329382 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" containerName="ovsdbserver-sb" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329388 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="cinder-scheduler" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329394 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ecee90-92ea-4ef1-a871-49018c2ac648" containerName="dnsmasq-dns" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.329403 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" containerName="probe" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.330035 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.333961 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data" (OuterVolumeSpecName: "config-data") pod "9be76a38-b85f-458f-b5c9-181abf962109" (UID: "9be76a38-b85f-458f-b5c9-181abf962109"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.336044 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.337485 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-clvpz"] Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.383371 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.392744 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.392882 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be76a38-b85f-458f-b5c9-181abf962109-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.494077 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle\") pod \"bab36094-736f-460a-83d1-bd298dee7774\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.494121 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdv6\" (UniqueName: \"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6\") pod \"bab36094-736f-460a-83d1-bd298dee7774\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.494148 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs\") pod \"bab36094-736f-460a-83d1-bd298dee7774\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.494176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data\") pod \"bab36094-736f-460a-83d1-bd298dee7774\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.494237 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs\") pod \"bab36094-736f-460a-83d1-bd298dee7774\" (UID: \"bab36094-736f-460a-83d1-bd298dee7774\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.500758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jl6k\" (UniqueName: \"kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.500842 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.503927 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6" (OuterVolumeSpecName: "kube-api-access-xwdv6") pod "bab36094-736f-460a-83d1-bd298dee7774" (UID: "bab36094-736f-460a-83d1-bd298dee7774"). InnerVolumeSpecName "kube-api-access-xwdv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.530930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data" (OuterVolumeSpecName: "config-data") pod "bab36094-736f-460a-83d1-bd298dee7774" (UID: "bab36094-736f-460a-83d1-bd298dee7774"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.543378 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bab36094-736f-460a-83d1-bd298dee7774" (UID: "bab36094-736f-460a-83d1-bd298dee7774"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.549858 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "bab36094-736f-460a-83d1-bd298dee7774" (UID: "bab36094-736f-460a-83d1-bd298dee7774"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.581899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "bab36094-736f-460a-83d1-bd298dee7774" (UID: "bab36094-736f-460a-83d1-bd298dee7774"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.582263 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f98-account-create-update-lcc8p" event={"ID":"4120b308-df6b-45df-ab90-abc5417228e5","Type":"ContainerStarted","Data":"e34b06348c1e98c466d929d0330c19f90981e845378263b31797f130d7616067"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.596743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f82e5b2f-cb79-4b83-901f-eca64116c6dc","Type":"ContainerDied","Data":"cb2d6e60f8a2bd293191e5cf991ab5c2cf38315ea96e0ad1d379d20116844258"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.596811 4830 scope.go:117] "RemoveContainer" containerID="29d7529f10ab82210873010bcc63b7af8c9609591cd2c3e31e8d0689d2b017f6" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.596822 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.600360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2062-account-create-update-92hq2" event={"ID":"b1e4500f-681e-433d-8283-008eec618721","Type":"ContainerStarted","Data":"065270537c3865a258900a7eb2eb661016789d139cd9f39c8a59db9a3babfd91"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jl6k\" (UniqueName: \"kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603184 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603435 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603511 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdv6\" (UniqueName: \"kubernetes.io/projected/bab36094-736f-460a-83d1-bd298dee7774-kube-api-access-xwdv6\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603633 4830 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603685 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.603734 4830 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab36094-736f-460a-83d1-bd298dee7774-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.606861 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.615245 4830 generic.go:334] "Generic (PLEG): container finished" podID="435574fa-a924-4289-a93a-dea05d57d105" containerID="01d8e91004d318c41a6579e547dd6425e1913b522dba6cd78012d1eca9d7aedf" exitCode=0 Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.615326 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerDied","Data":"01d8e91004d318c41a6579e547dd6425e1913b522dba6cd78012d1eca9d7aedf"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.619663 4830 generic.go:334] "Generic (PLEG): container finished" podID="bab36094-736f-460a-83d1-bd298dee7774" containerID="62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7" exitCode=0 Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.619760 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.620131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bab36094-736f-460a-83d1-bd298dee7774","Type":"ContainerDied","Data":"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.620219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bab36094-736f-460a-83d1-bd298dee7774","Type":"ContainerDied","Data":"92cdf158326b48ba0322e8a532b52ede524b5b52f72cda66606bad3322557e82"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.620396 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.621188 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a2d6-account-create-update-ltf4b" event={"ID":"a3421452-ceb9-441f-8982-77c0a33c7a3b","Type":"ContainerStarted","Data":"796a30488dcd236dc2b9b04475257c4ec18922738550da5b7bc46bb1c32b2b6c"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.641294 4830 scope.go:117] "RemoveContainer" containerID="b5a1cb9ea4b62aec9ff11f16f75f04dd21b28a1d37ed79fbc6fa3de1b8390289" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.646588 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jl6k\" (UniqueName: \"kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k\") pod \"root-account-create-update-clvpz\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.669001 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.683704 4830 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.698472 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6d11dd9-4b5b-463e-a834-91c7ecc8b021/ovsdbserver-sb/0.log" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.698648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6d11dd9-4b5b-463e-a834-91c7ecc8b021","Type":"ContainerDied","Data":"cfed119a9d9df8a00ccdadf5a187c6fa1edd82ab5917850b8b4c39e4ed1bcd6b"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.698826 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.707694 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.707736 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.707949 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.707995 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.708021 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft9kh\" (UniqueName: \"kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.708056 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.708090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.708128 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config\") pod \"435574fa-a924-4289-a93a-dea05d57d105\" (UID: \"435574fa-a924-4289-a93a-dea05d57d105\") " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.708530 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.709338 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.709441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.710040 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.721215 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.721929 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh" (OuterVolumeSpecName: "kube-api-access-ft9kh") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "kube-api-access-ft9kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.727052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.727287 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4043-account-create-update-qpth4" event={"ID":"781fdccd-a9f3-40ce-9234-d651c079eb1e","Type":"ContainerDied","Data":"b501e9203036a4afb723c9dc655a2c3127365dbd4dff4770f3f8216154935e6b"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.727384 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4043-account-create-update-qpth4" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.779766 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.788141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d76d78d97-bs4hd" event={"ID":"9be76a38-b85f-458f-b5c9-181abf962109","Type":"ContainerDied","Data":"e4ae57fbf770084e2f021cc7025e5db67046ff08c356ae524ef0c1c7a7981718"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.788282 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d76d78d97-bs4hd" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.791172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6" event={"ID":"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd","Type":"ContainerStarted","Data":"9a0359e08c652b70bb3371e22047c117cf60a060f58265a5fcaaadf6a0c827fc"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.805149 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": read tcp 10.217.0.2:56780->10.217.0.169:8776: read: connection reset by peer" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.820998 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8f8-account-create-update-knfmq" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.821003 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8f8-account-create-update-knfmq" event={"ID":"0df8dbfa-578e-4edf-ac2a-2030b582bc63","Type":"ContainerDied","Data":"7ed51f6ee3c0f81f2ecb70d99000e5203727aa839556140bf48a3b994c0d3bdf"} Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825323 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/435574fa-a924-4289-a93a-dea05d57d105-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825689 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft9kh\" (UniqueName: \"kubernetes.io/projected/435574fa-a924-4289-a93a-dea05d57d105-kube-api-access-ft9kh\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825700 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825724 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825735 4830 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825744 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/435574fa-a924-4289-a93a-dea05d57d105-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.825753 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.839179 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.839504 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-chwf9" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.840671 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.848371 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "435574fa-a924-4289-a93a-dea05d57d105" (UID: "435574fa-a924-4289-a93a-dea05d57d105"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.855076 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.932096 4830 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/435574fa-a924-4289-a93a-dea05d57d105-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: I0318 18:25:02.932381 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.932446 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:25:02 crc kubenswrapper[4830]: E0318 18:25:02.932497 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts podName:e3a34a0e-8390-4618-8b6e-c27ed8adc51a nodeName:}" failed. No retries permitted until 2026-03-18 18:25:04.932481925 +0000 UTC m=+1339.500112257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts") pod "root-account-create-update-lhdqd" (UID: "e3a34a0e-8390-4618-8b6e-c27ed8adc51a") : configmap "openstack-cell1-scripts" not found Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.035290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.035379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") pod \"nova-cell1-88cd-account-create-update-8vhqn\" (UID: \"eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5\") " pod="openstack/nova-cell1-88cd-account-create-update-8vhqn" Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.035515 4830 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.035586 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:07.035567928 +0000 UTC m=+1341.603198260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : configmap "openstack-cell1-scripts" not found Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.040412 4830 projected.go:194] Error preparing data for projected volume kube-api-access-wcd8w for pod openstack/nova-cell1-88cd-account-create-update-8vhqn: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.040472 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w podName:eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:07.040453086 +0000 UTC m=+1341.608083418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wcd8w" (UniqueName: "kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w") pod "nova-cell1-88cd-account-create-update-8vhqn" (UID: "eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.076988 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7f98-account-create-update-lcc8p" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.080256 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.080892 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-central-agent" containerID="cri-o://65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3" gracePeriod=30 Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.081414 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="sg-core" containerID="cri-o://005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba" gracePeriod=30 Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.081568 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="proxy-httpd" containerID="cri-o://63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9" gracePeriod=30 Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.085244 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-notification-agent" containerID="cri-o://2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec" gracePeriod=30 Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.100045 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.107070 4830 scope.go:117] "RemoveContainer" containerID="62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7" Mar 18 18:25:03 
crc kubenswrapper[4830]: I0318 18:25:03.111188 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4043-account-create-update-qpth4"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.159477 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.167298 4830 scope.go:117] "RemoveContainer" containerID="62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7" Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.179941 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7\": container with ID starting with 62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7 not found: ID does not exist" containerID="62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.179988 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7"} err="failed to get container status \"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7\": rpc error: code = NotFound desc = could not find container \"62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7\": container with ID starting with 62da465c43e927fc0029ba25702e0c328ba20afbde5a9049d2d8a147434c24a7 not found: ID does not exist" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.180017 4830 scope.go:117] "RemoveContainer" containerID="6d02c3d4022f8ff71336fe32eb97efefa0f42dad83cb62b31f62c9f071d62b10" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.185951 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.186209 4830 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3e810512-a127-40b3-b1c2-559c3b86fcdb" containerName="kube-state-metrics" containerID="cri-o://966ee135f8e6e3d440939198bd3d2a3c627df5403d51fc43caced871d92ebe29" gracePeriod=30 Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.232917 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-d76d78d97-bs4hd"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.240468 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts\") pod \"4120b308-df6b-45df-ab90-abc5417228e5\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.240621 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6llb\" (UniqueName: \"kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb\") pod \"4120b308-df6b-45df-ab90-abc5417228e5\" (UID: \"4120b308-df6b-45df-ab90-abc5417228e5\") " Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.258293 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4120b308-df6b-45df-ab90-abc5417228e5" (UID: "4120b308-df6b-45df-ab90-abc5417228e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.259301 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2062-account-create-update-92hq2" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.262333 4830 scope.go:117] "RemoveContainer" containerID="e27720e7dca97ec5784c549e6e6c7e84e6b4913613d159710e88f4288654e511" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.319693 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-chwf9"] Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.329895 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb" (OuterVolumeSpecName: "kube-api-access-z6llb") pod "4120b308-df6b-45df-ab90-abc5417228e5" (UID: "4120b308-df6b-45df-ab90-abc5417228e5"). InnerVolumeSpecName "kube-api-access-z6llb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.363095 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-chwf9"] Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.373654 4830 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.373725 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data podName:56fb6c83-b748-4e21-9b1c-90fb37cefea1 nodeName:}" failed. No retries permitted until 2026-03-18 18:25:07.373704184 +0000 UTC m=+1341.941334516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data") pod "rabbitmq-server-0" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1") : configmap "rabbitmq-config-data" not found
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.374217 4830 scope.go:117] "RemoveContainer" containerID="c0416d3b3912bda28adfb32ff6910ca06aa3d2a68ff4208501b26467c7a964b5"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.374281 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6llb\" (UniqueName: \"kubernetes.io/projected/4120b308-df6b-45df-ab90-abc5417228e5-kube-api-access-z6llb\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.374320 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4120b308-df6b-45df-ab90-abc5417228e5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.439423 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.453869 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.465681 4830 scope.go:117] "RemoveContainer" containerID="10ea1ae62f7573f638e31db710f3455f544b39c9e8f84f23270b74eeae48b588"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.473295 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.476478 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts\") pod \"b1e4500f-681e-433d-8283-008eec618721\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") "
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.476635 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn6h4\" (UniqueName: \"kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4\") pod \"b1e4500f-681e-433d-8283-008eec618721\" (UID: \"b1e4500f-681e-433d-8283-008eec618721\") "
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.477613 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1e4500f-681e-433d-8283-008eec618721" (UID: "b1e4500f-681e-433d-8283-008eec618721"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.486859 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b8f8-account-create-update-knfmq"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.505010 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4" (OuterVolumeSpecName: "kube-api-access-tn6h4") pod "b1e4500f-681e-433d-8283-008eec618721" (UID: "b1e4500f-681e-433d-8283-008eec618721"). InnerVolumeSpecName "kube-api-access-tn6h4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.513052 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.513270 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="a0a1e291-1a11-4747-96ed-32c95623dcbb" containerName="memcached" containerID="cri-o://c6c30f91c3c07f2417a561616bc4ab4ba1863961710fa17a2a7105a6e4af19cd" gracePeriod=30
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.522524 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.525030 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-df3e-account-create-update-vd9pb"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.532819 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.536807 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-df3e-account-create-update-vd9pb"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.546846 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-df3e-account-create-update-jn48p"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.547393 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab36094-736f-460a-83d1-bd298dee7774" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.547414 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab36094-736f-460a-83d1-bd298dee7774" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.547430 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="galera"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.547438 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="galera"
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.547447 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="mysql-bootstrap"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.547453 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="mysql-bootstrap"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.547628 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="435574fa-a924-4289-a93a-dea05d57d105" containerName="galera"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.547649 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab36094-736f-460a-83d1-bd298dee7774" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.548315 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.550725 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.564899 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-8vhqn"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.573837 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df3e-account-create-update-jn48p"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.580703 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.580926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h58f\" (UniqueName: \"kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.581109 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn6h4\" (UniqueName: \"kubernetes.io/projected/b1e4500f-681e-433d-8283-008eec618721-kube-api-access-tn6h4\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.581172 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4500f-681e-433d-8283-008eec618721-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.582373 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88cd-account-create-update-8vhqn"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.597515 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rkkhc"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.597580 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rkkhc"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.598714 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fknfr"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.613829 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fknfr"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.619645 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.619902 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6f6ff8b5bf-p5xgc" podUID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" containerName="keystone-api" containerID="cri-o://9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7" gracePeriod=30
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.628780 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.640095 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.670836 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.682406 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.682509 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h58f\" (UniqueName: \"kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.683053 4830 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.683166 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:04.183127348 +0000 UTC m=+1338.750757680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : configmap "openstack-scripts" not found
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.685795 4830 projected.go:194] Error preparing data for projected volume kube-api-access-8h58f for pod openstack/keystone-df3e-account-create-update-jn48p: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.685862 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:04.185842864 +0000 UTC m=+1338.753473196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8h58f" (UniqueName: "kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.701992 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-df3e-account-create-update-jn48p"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.702748 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8h58f operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-df3e-account-create-update-jn48p" podUID="0586337e-461c-4367-8f0a-0bc3593732ce"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.715897 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-btc59"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.736999 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-btc59"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.761859 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-clvpz"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.774877 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.775527 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.775812 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.776033 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.776066 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.784574 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcd8w\" (UniqueName: \"kubernetes.io/projected/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-kube-api-access-wcd8w\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.784599 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.785292 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.786571 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.786602 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.868921 4830 generic.go:334] "Generic (PLEG): container finished" podID="1e8e20bd-67c1-48a7-be43-c585d65656ea" containerID="4cc141c38da7f2f14e8af81b886f2466b15b63a804233f4ae743bb0e785d7d90" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.868991 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1e8e20bd-67c1-48a7-be43-c585d65656ea","Type":"ContainerDied","Data":"4cc141c38da7f2f14e8af81b886f2466b15b63a804233f4ae743bb0e785d7d90"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.869013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1e8e20bd-67c1-48a7-be43-c585d65656ea","Type":"ContainerDied","Data":"af056868fa1366cf5665b0b5558680ca7fbae4a4157d29750ff0672dfb35222e"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.869024 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af056868fa1366cf5665b0b5558680ca7fbae4a4157d29750ff0672dfb35222e"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.871848 4830 generic.go:334] "Generic (PLEG): container finished" podID="ad760963-34af-440e-9931-fbc23783d7cb" containerID="06b5da3aa085e9b3e11d65936e872fab74b18aa97d39f5db82fc225e3ce954b4" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.871913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerDied","Data":"06b5da3aa085e9b3e11d65936e872fab74b18aa97d39f5db82fc225e3ce954b4"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.877222 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerID="41f23f0d4fef2bb42d4c0645e34a4042e362df833aa1814c1dd80e578b447069" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.877259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerDied","Data":"41f23f0d4fef2bb42d4c0645e34a4042e362df833aa1814c1dd80e578b447069"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.881298 4830 generic.go:334] "Generic (PLEG): container finished" podID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerID="13a949ebe12567f356b288e72620234deec79f64d460b08c050f70b6131858f4" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.881424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerDied","Data":"13a949ebe12567f356b288e72620234deec79f64d460b08c050f70b6131858f4"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.886057 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6" event={"ID":"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd","Type":"ContainerDied","Data":"9a0359e08c652b70bb3371e22047c117cf60a060f58265a5fcaaadf6a0c827fc"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.886091 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0359e08c652b70bb3371e22047c117cf60a060f58265a5fcaaadf6a0c827fc"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.892421 4830 generic.go:334] "Generic (PLEG): container finished" podID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerID="29fc62aa8b0c7dff64144c93d1f53c7be2667c73d45b77f8b2e9fee0136dd279" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.892501 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerDied","Data":"29fc62aa8b0c7dff64144c93d1f53c7be2667c73d45b77f8b2e9fee0136dd279"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.892520 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b","Type":"ContainerDied","Data":"c4362062ff7d150b86119d5b1cbf2b485cb23abf3d495b273bcb1819655c53b7"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.892529 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4362062ff7d150b86119d5b1cbf2b485cb23abf3d495b273bcb1819655c53b7"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.894539 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-clvpz"]
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897053 4830 generic.go:334] "Generic (PLEG): container finished" podID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerID="63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897081 4830 generic.go:334] "Generic (PLEG): container finished" podID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerID="005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba" exitCode=2
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897089 4830 generic.go:334] "Generic (PLEG): container finished" podID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerID="65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3" exitCode=0
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897123 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerDied","Data":"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897145 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerDied","Data":"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.897156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerDied","Data":"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.898237 4830 generic.go:334] "Generic (PLEG): container finished" podID="3e810512-a127-40b3-b1c2-559c3b86fcdb" containerID="966ee135f8e6e3d440939198bd3d2a3c627df5403d51fc43caced871d92ebe29" exitCode=2
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.898314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3e810512-a127-40b3-b1c2-559c3b86fcdb","Type":"ContainerDied","Data":"966ee135f8e6e3d440939198bd3d2a3c627df5403d51fc43caced871d92ebe29"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.898329 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3e810512-a127-40b3-b1c2-559c3b86fcdb","Type":"ContainerDied","Data":"bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.898339 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8f14e621ac53fe72e937b4acbc8fa7a12ab0dee9801f4c8c752ff688d0876f"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.899002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2062-account-create-update-92hq2" event={"ID":"b1e4500f-681e-433d-8283-008eec618721","Type":"ContainerDied","Data":"065270537c3865a258900a7eb2eb661016789d139cd9f39c8a59db9a3babfd91"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.899052 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2062-account-create-update-92hq2"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.904541 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lhdqd" event={"ID":"e3a34a0e-8390-4618-8b6e-c27ed8adc51a","Type":"ContainerDied","Data":"62a2356e29e06291d1908bb5af963b79ec3b661964253de42b333344b86545b3"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.904578 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a2356e29e06291d1908bb5af963b79ec3b661964253de42b333344b86545b3"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.906163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"435574fa-a924-4289-a93a-dea05d57d105","Type":"ContainerDied","Data":"cf96e6e3aa6cfbde5731d0a8ac5bfd6e0ea77de40696ffd41f2dcf7a5ab2da05"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.906189 4830 scope.go:117] "RemoveContainer" containerID="01d8e91004d318c41a6579e547dd6425e1913b522dba6cd78012d1eca9d7aedf"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.906292 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.913509 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a2d6-account-create-update-ltf4b" event={"ID":"a3421452-ceb9-441f-8982-77c0a33c7a3b","Type":"ContainerDied","Data":"796a30488dcd236dc2b9b04475257c4ec18922738550da5b7bc46bb1c32b2b6c"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.913546 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="796a30488dcd236dc2b9b04475257c4ec18922738550da5b7bc46bb1c32b2b6c"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.915237 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.915561 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7f98-account-create-update-lcc8p"
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.916258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7f98-account-create-update-lcc8p" event={"ID":"4120b308-df6b-45df-ab90-abc5417228e5","Type":"ContainerDied","Data":"e34b06348c1e98c466d929d0330c19f90981e845378263b31797f130d7616067"}
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.960441 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="galera" containerID="cri-o://1faa9ae26dd3b098b13664816e21acb92be7c62408cd3cf5567216f95dc7ad27" gracePeriod=30
Mar 18 18:25:03 crc kubenswrapper[4830]: I0318 18:25:03.962517 4830 scope.go:117] "RemoveContainer" containerID="8a2b2534baed3a130b8121d69c5626b8abb92c0dc65a019d61420e4ccd6e5352"
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.997353 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 18:25:03 crc kubenswrapper[4830]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: if [ -n "" ]; then
Mar 18 18:25:03 crc kubenswrapper[4830]: GRANT_DATABASE=""
Mar 18 18:25:03 crc kubenswrapper[4830]: else
Mar 18 18:25:03 crc kubenswrapper[4830]: GRANT_DATABASE="*"
Mar 18 18:25:03 crc kubenswrapper[4830]: fi
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: # going for maximum compatibility here:
Mar 18 18:25:03 crc kubenswrapper[4830]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 18 18:25:03 crc kubenswrapper[4830]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 18 18:25:03 crc kubenswrapper[4830]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 18 18:25:03 crc kubenswrapper[4830]: # support updates
Mar 18 18:25:03 crc kubenswrapper[4830]:
Mar 18 18:25:03 crc kubenswrapper[4830]: $MYSQL_CMD < logger="UnhandledError"
Mar 18 18:25:03 crc kubenswrapper[4830]: E0318 18:25:03.998715 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-clvpz" podUID="77147fe4-670f-40ca-ab50-4d3220442eee"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.084392 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.101513 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.107555 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86vlz\" (UniqueName: \"kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz\") pod \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") "
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.107645 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5plzm\" (UniqueName: \"kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm\") pod \"1e8e20bd-67c1-48a7-be43-c585d65656ea\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") "
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.107666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle\") pod \"1e8e20bd-67c1-48a7-be43-c585d65656ea\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") "
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.107702 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts\") pod \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\" (UID: \"a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd\") "
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.107731 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data\") pod \"1e8e20bd-67c1-48a7-be43-c585d65656ea\" (UID: \"1e8e20bd-67c1-48a7-be43-c585d65656ea\") "
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.117640 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd" (UID: "a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.122097 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm" (OuterVolumeSpecName: "kube-api-access-5plzm") pod "1e8e20bd-67c1-48a7-be43-c585d65656ea" (UID: "1e8e20bd-67c1-48a7-be43-c585d65656ea"). InnerVolumeSpecName "kube-api-access-5plzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.126325 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"]
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.162980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e8e20bd-67c1-48a7-be43-c585d65656ea" (UID: "1e8e20bd-67c1-48a7-be43-c585d65656ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.163446 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz" (OuterVolumeSpecName: "kube-api-access-86vlz") pod "a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd" (UID: "a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd"). InnerVolumeSpecName "kube-api-access-86vlz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.164539 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2062-account-create-update-92hq2"]
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.178912 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data" (OuterVolumeSpecName: "config-data") pod "1e8e20bd-67c1-48a7-be43-c585d65656ea" (UID: "1e8e20bd-67c1-48a7-be43-c585d65656ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.194410 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lhdqd"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.194756 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.205870 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-ltf4b"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.206333 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.212915 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.213018 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h58f\" (UniqueName: \"kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.214367 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.214415 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.214448 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86vlz\" (UniqueName: \"kubernetes.io/projected/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd-kube-api-access-86vlz\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.214493 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5plzm\" (UniqueName: \"kubernetes.io/projected/1e8e20bd-67c1-48a7-be43-c585d65656ea-kube-api-access-5plzm\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.214523 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8e20bd-67c1-48a7-be43-c585d65656ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.215965 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-jn48p"
Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.216027 4830 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.216108 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:05.216088039 +0000 UTC m=+1339.783718371 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : configmap "openstack-scripts" not found Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.218926 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f884dc87d-6wvs2" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:58230->10.217.0.171:9311: read: connection reset by peer" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.218955 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f884dc87d-6wvs2" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:58220->10.217.0.171:9311: read: connection reset by peer" Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.219606 4830 projected.go:194] Error preparing data for projected volume kube-api-access-8h58f for pod openstack/keystone-df3e-account-create-update-jn48p: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.219641 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:05.219632779 +0000 UTC m=+1339.787263111 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8h58f" (UniqueName: "kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319601 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhxmg\" (UniqueName: \"kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319676 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxbn\" (UniqueName: \"kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn\") pod \"3e810512-a127-40b3-b1c2-559c3b86fcdb\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319810 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts\") pod \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319837 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle\") pod \"3e810512-a127-40b3-b1c2-559c3b86fcdb\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319866 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319894 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2c4q\" (UniqueName: \"kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q\") pod \"a3421452-ceb9-441f-8982-77c0a33c7a3b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf6vv\" (UniqueName: \"kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv\") pod \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\" (UID: \"e3a34a0e-8390-4618-8b6e-c27ed8adc51a\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319965 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.319984 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 
18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320013 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320058 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320084 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts\") pod \"a3421452-ceb9-441f-8982-77c0a33c7a3b\" (UID: \"a3421452-ceb9-441f-8982-77c0a33c7a3b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320127 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs\") pod \"3e810512-a127-40b3-b1c2-559c3b86fcdb\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320149 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320194 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config\") pod \"3e810512-a127-40b3-b1c2-559c3b86fcdb\" (UID: \"3e810512-a127-40b3-b1c2-559c3b86fcdb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.320218 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts\") pod \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\" (UID: \"9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.327514 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.327550 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3a34a0e-8390-4618-8b6e-c27ed8adc51a" (UID: "e3a34a0e-8390-4618-8b6e-c27ed8adc51a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.332433 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3421452-ceb9-441f-8982-77c0a33c7a3b" (UID: "a3421452-ceb9-441f-8982-77c0a33c7a3b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.336740 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.341875 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs" (OuterVolumeSpecName: "logs") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.352736 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q" (OuterVolumeSpecName: "kube-api-access-p2c4q") pod "a3421452-ceb9-441f-8982-77c0a33c7a3b" (UID: "a3421452-ceb9-441f-8982-77c0a33c7a3b"). InnerVolumeSpecName "kube-api-access-p2c4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.352878 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg" (OuterVolumeSpecName: "kube-api-access-rhxmg") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "kube-api-access-rhxmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.371141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts" (OuterVolumeSpecName: "scripts") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.381700 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv" (OuterVolumeSpecName: "kube-api-access-zf6vv") pod "e3a34a0e-8390-4618-8b6e-c27ed8adc51a" (UID: "e3a34a0e-8390-4618-8b6e-c27ed8adc51a"). InnerVolumeSpecName "kube-api-access-zf6vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.409620 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430334 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2c4q\" (UniqueName: \"kubernetes.io/projected/a3421452-ceb9-441f-8982-77c0a33c7a3b-kube-api-access-p2c4q\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430371 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf6vv\" (UniqueName: \"kubernetes.io/projected/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-kube-api-access-zf6vv\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430380 4830 reconciler_common.go:293] "Volume 
detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430389 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3421452-ceb9-441f-8982-77c0a33c7a3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430397 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430408 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430415 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhxmg\" (UniqueName: \"kubernetes.io/projected/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-kube-api-access-rhxmg\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430424 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.430433 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a34a0e-8390-4618-8b6e-c27ed8adc51a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.438961 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn" 
(OuterVolumeSpecName: "kube-api-access-scxbn") pod "3e810512-a127-40b3-b1c2-559c3b86fcdb" (UID: "3e810512-a127-40b3-b1c2-559c3b86fcdb"). InnerVolumeSpecName "kube-api-access-scxbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.442940 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.444462 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:25:04 crc kubenswrapper[4830]: E0318 18:25:04.444504 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerName="nova-cell1-conductor-conductor" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.470847 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df8dbfa-578e-4edf-ac2a-2030b582bc63" path="/var/lib/kubelet/pods/0df8dbfa-578e-4edf-ac2a-2030b582bc63/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.471369 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3ffcbf-d066-4c5f-bf95-8503bcb983cf" path="/var/lib/kubelet/pods/4d3ffcbf-d066-4c5f-bf95-8503bcb983cf/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.472754 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544c01f7-a6da-45de-96f2-9ab9dea0567c" path="/var/lib/kubelet/pods/544c01f7-a6da-45de-96f2-9ab9dea0567c/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.474682 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781fdccd-a9f3-40ce-9234-d651c079eb1e" path="/var/lib/kubelet/pods/781fdccd-a9f3-40ce-9234-d651c079eb1e/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.476247 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93739148-39fb-4db3-ae9d-d222feb368d7" path="/var/lib/kubelet/pods/93739148-39fb-4db3-ae9d-d222feb368d7/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.477720 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be76a38-b85f-458f-b5c9-181abf962109" path="/var/lib/kubelet/pods/9be76a38-b85f-458f-b5c9-181abf962109/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.479483 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e4500f-681e-433d-8283-008eec618721" path="/var/lib/kubelet/pods/b1e4500f-681e-433d-8283-008eec618721/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.480849 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b738a352-32c7-4373-8324-8a02c359d300" path="/var/lib/kubelet/pods/b738a352-32c7-4373-8324-8a02c359d300/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.481427 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab36094-736f-460a-83d1-bd298dee7774" path="/var/lib/kubelet/pods/bab36094-736f-460a-83d1-bd298dee7774/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.481940 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d25630f0-a59e-44ba-94ba-bd0ae9216b42" path="/var/lib/kubelet/pods/d25630f0-a59e-44ba-94ba-bd0ae9216b42/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.483741 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d11dd9-4b5b-463e-a834-91c7ecc8b021" path="/var/lib/kubelet/pods/e6d11dd9-4b5b-463e-a834-91c7ecc8b021/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.484250 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5" path="/var/lib/kubelet/pods/eb1e8c0c-7abc-4bb0-93d4-4b0ad52adbf5/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.484733 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2185d9-90ed-4fc9-a38b-eab30e813652" path="/var/lib/kubelet/pods/ee2185d9-90ed-4fc9-a38b-eab30e813652/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.485537 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82e5b2f-cb79-4b83-901f-eca64116c6dc" path="/var/lib/kubelet/pods/f82e5b2f-cb79-4b83-901f-eca64116c6dc/volumes" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.519246 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.528037 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.528402 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.528481 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.528540 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7f98-account-create-update-lcc8p"] Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.528245 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.529441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e810512-a127-40b3-b1c2-559c3b86fcdb" (UID: "3e810512-a127-40b3-b1c2-559c3b86fcdb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.531276 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scxbn\" (UniqueName: \"kubernetes.io/projected/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-api-access-scxbn\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.531381 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.531453 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.539540 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.541750 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.542139 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-676956db6-6grw2" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.558327 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.566908 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "3e810512-a127-40b3-b1c2-559c3b86fcdb" (UID: "3e810512-a127-40b3-b1c2-559c3b86fcdb"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.600885 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.606945 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "3e810512-a127-40b3-b1c2-559c3b86fcdb" (UID: "3e810512-a127-40b3-b1c2-559c3b86fcdb"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.625330 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data" (OuterVolumeSpecName: "config-data") pod "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" (UID: "9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632871 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632897 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8cq\" (UniqueName: \"kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632919 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632941 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.632987 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633011 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633071 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633088 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633120 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data\") pod \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633145 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs\") pod \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633190 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle\") pod \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633210 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: 
\"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633226 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633251 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633283 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633321 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633351 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm9sr\" (UniqueName: \"kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr\") pod \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633370 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633397 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrdng\" (UniqueName: \"kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng\") pod \"ad760963-34af-440e-9931-fbc23783d7cb\" (UID: \"ad760963-34af-440e-9931-fbc23783d7cb\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633414 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633430 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs\") pod \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\" (UID: \"e8631247-bdcb-45ff-a17d-ac7e7ff81800\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633451 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vngf\" (UniqueName: \"kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf\") pod \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\" (UID: \"0ac8a4f8-88e7-4cd0-ab89-210fb088b137\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs\") pod \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\" (UID: \"dd0fbdd2-a99b-4758-9f27-1f5055ca0172\") " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633526 4830 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633820 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633836 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633846 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633855 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633865 4830 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.633874 4830 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3e810512-a127-40b3-b1c2-559c3b86fcdb-kube-state-metrics-tls-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.634169 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs" (OuterVolumeSpecName: "logs") pod "dd0fbdd2-a99b-4758-9f27-1f5055ca0172" (UID: "dd0fbdd2-a99b-4758-9f27-1f5055ca0172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.638549 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.646101 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.648752 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs" (OuterVolumeSpecName: "logs") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.650068 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs" (OuterVolumeSpecName: "logs") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.650589 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq" (OuterVolumeSpecName: "kube-api-access-8d8cq") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "kube-api-access-8d8cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.656929 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.657249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts" (OuterVolumeSpecName: "scripts") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.657291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts" (OuterVolumeSpecName: "scripts") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.657354 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf" (OuterVolumeSpecName: "kube-api-access-5vngf") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "kube-api-access-5vngf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.658174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs" (OuterVolumeSpecName: "logs") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.668207 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts" (OuterVolumeSpecName: "scripts") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.668284 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng" (OuterVolumeSpecName: "kube-api-access-lrdng") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "kube-api-access-lrdng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.671733 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr" (OuterVolumeSpecName: "kube-api-access-fm9sr") pod "dd0fbdd2-a99b-4758-9f27-1f5055ca0172" (UID: "dd0fbdd2-a99b-4758-9f27-1f5055ca0172"). InnerVolumeSpecName "kube-api-access-fm9sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.709086 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743206 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743254 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743266 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743282 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad760963-34af-440e-9931-fbc23783d7cb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743296 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743305 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743314 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm9sr\" (UniqueName: \"kubernetes.io/projected/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-kube-api-access-fm9sr\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743325 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743333 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrdng\" (UniqueName: \"kubernetes.io/projected/ad760963-34af-440e-9931-fbc23783d7cb-kube-api-access-lrdng\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743341 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743348 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8631247-bdcb-45ff-a17d-ac7e7ff81800-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743356 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vngf\" (UniqueName: \"kubernetes.io/projected/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-kube-api-access-5vngf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743364 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743377 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.743386 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8cq\" (UniqueName: \"kubernetes.io/projected/e8631247-bdcb-45ff-a17d-ac7e7ff81800-kube-api-access-8d8cq\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc 
kubenswrapper[4830]: I0318 18:25:04.746083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.779017 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data" (OuterVolumeSpecName: "config-data") pod "dd0fbdd2-a99b-4758-9f27-1f5055ca0172" (UID: "dd0fbdd2-a99b-4758-9f27-1f5055ca0172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.821221 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.860416 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.860438 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.860447 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.988542 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="11e19037-abf1-4269-b933-0950913973b9" containerID="e3cd2ffc35cea964dcec2e27b4b151f289beecdcd0e3b5f7b932d52f599b93c0" exitCode=0 Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.988589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerDied","Data":"e3cd2ffc35cea964dcec2e27b4b151f289beecdcd0e3b5f7b932d52f599b93c0"} Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.997378 4830 generic.go:334] "Generic (PLEG): container finished" podID="a0a1e291-1a11-4747-96ed-32c95623dcbb" containerID="c6c30f91c3c07f2417a561616bc4ab4ba1863961710fa17a2a7105a6e4af19cd" exitCode=0 Mar 18 18:25:04 crc kubenswrapper[4830]: I0318 18:25:04.997454 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a0a1e291-1a11-4747-96ed-32c95623dcbb","Type":"ContainerDied","Data":"c6c30f91c3c07f2417a561616bc4ab4ba1863961710fa17a2a7105a6e4af19cd"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.000901 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.001232 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8631247-bdcb-45ff-a17d-ac7e7ff81800","Type":"ContainerDied","Data":"82adec69ccede2e466b158cd9eeeee18db05fec83283556fdc16d31adf5888b0"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.001346 4830 scope.go:117] "RemoveContainer" containerID="41f23f0d4fef2bb42d4c0645e34a4042e362df833aa1814c1dd80e578b447069" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.012477 4830 generic.go:334] "Generic (PLEG): container finished" podID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerID="fc53817ebacc0ce8c203daf49d972d55c5cc1843b058744c9a909e3088e8e2dc" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.012531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerDied","Data":"fc53817ebacc0ce8c203daf49d972d55c5cc1843b058744c9a909e3088e8e2dc"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.013683 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-clvpz" event={"ID":"77147fe4-670f-40ca-ab50-4d3220442eee","Type":"ContainerStarted","Data":"b025bd585a36a924f268852de9a6e2b8372995c5905c3efd0f80f7bb05716102"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.023427 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3ba738f-c556-4b36-a045-3516efdf886a" containerID="91aff4166cbebec7917a849f1dae12a4f2caababfa680539bc75bf53f49cf551" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.023475 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerDied","Data":"91aff4166cbebec7917a849f1dae12a4f2caababfa680539bc75bf53f49cf551"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.023493 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3ba738f-c556-4b36-a045-3516efdf886a","Type":"ContainerDied","Data":"f33077f9185a7354604c1c307deb7f9d8596ac8e975665c909a3a47886c7b2ac"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.023503 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33077f9185a7354604c1c307deb7f9d8596ac8e975665c909a3a47886c7b2ac" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.030254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd0fbdd2-a99b-4758-9f27-1f5055ca0172" (UID: "dd0fbdd2-a99b-4758-9f27-1f5055ca0172"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.047446 4830 generic.go:334] "Generic (PLEG): container finished" podID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerID="e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.047609 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerDied","Data":"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.047640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd0fbdd2-a99b-4758-9f27-1f5055ca0172","Type":"ContainerDied","Data":"9e11bb5f59234678f00d01cc0b778b5cdb4c6aa29fb289544c4f5a0c09d38e67"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.047724 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.055817 4830 generic.go:334] "Generic (PLEG): container finished" podID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerID="d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.055919 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"44872ddd-52a8-4ca8-a07e-f84111475b8f","Type":"ContainerDied","Data":"d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.057949 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-676956db6-6grw2" event={"ID":"ad760963-34af-440e-9931-fbc23783d7cb","Type":"ContainerDied","Data":"ba4e7878acb6a02897824a50a1bd651baa3285202223a312e374b33d07c03478"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.058038 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-676956db6-6grw2" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.063419 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0e71339-fd75-44b3-bbb8-15d75455d90f","Type":"ContainerDied","Data":"b5f8b7f66219fddf66e22ef6b5a06dba84482b8f68cbbeea50a396ebe1d339d0"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.069083 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.063262 4830 generic.go:334] "Generic (PLEG): container finished" podID="a0e71339-fd75-44b3-bbb8-15d75455d90f" containerID="b5f8b7f66219fddf66e22ef6b5a06dba84482b8f68cbbeea50a396ebe1d339d0" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.079962 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.104026 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.104538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0ac8a4f8-88e7-4cd0-ab89-210fb088b137","Type":"ContainerDied","Data":"a6a7b889d97bafd13659fbd280beab2f2e9328ce830ad8c489d224c19d4ad7f2"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.166613 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.166757 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dd0fbdd2-a99b-4758-9f27-1f5055ca0172" (UID: "dd0fbdd2-a99b-4758-9f27-1f5055ca0172"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.170554 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.170672 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0fbdd2-a99b-4758-9f27-1f5055ca0172-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.170749 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.172229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data" (OuterVolumeSpecName: "config-data") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.179716 4830 generic.go:334] "Generic (PLEG): container finished" podID="3e152864-9096-47a7-b0b0-c288840093e7" containerID="e20014a42907afd388ba14b211a6c05885fe859da4a4d5b322dfc735c19c8637" exitCode=0 Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.179855 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.181216 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.181689 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03fa-account-create-update-fsvc6" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.182005 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df3e-account-create-update-jn48p" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.182295 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerDied","Data":"e20014a42907afd388ba14b211a6c05885fe859da4a4d5b322dfc735c19c8637"} Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.182584 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lhdqd" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.182695 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.182728 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a2d6-account-create-update-ltf4b" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.204822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data" (OuterVolumeSpecName: "config-data") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.238122 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ac8a4f8-88e7-4cd0-ab89-210fb088b137" (UID: "0ac8a4f8-88e7-4cd0-ab89-210fb088b137"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.244908 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.253931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.273978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h58f\" (UniqueName: \"kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.274092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts\") pod \"keystone-df3e-account-create-update-jn48p\" (UID: \"0586337e-461c-4367-8f0a-0bc3593732ce\") " pod="openstack/keystone-df3e-account-create-update-jn48p" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.274153 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.274164 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.274175 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.274184 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc 
kubenswrapper[4830]: I0318 18:25:05.274193 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac8a4f8-88e7-4cd0-ab89-210fb088b137-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.274251 4830 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.274307 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:07.274288172 +0000 UTC m=+1341.841918504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : configmap "openstack-scripts" not found Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.281131 4830 projected.go:194] Error preparing data for projected volume kube-api-access-8h58f for pod openstack/keystone-df3e-account-create-update-jn48p: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.281197 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f podName:0586337e-461c-4367-8f0a-0bc3593732ce nodeName:}" failed. No retries permitted until 2026-03-18 18:25:07.281175576 +0000 UTC m=+1341.848805908 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8h58f" (UniqueName: "kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f") pod "keystone-df3e-account-create-update-jn48p" (UID: "0586337e-461c-4367-8f0a-0bc3593732ce") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.288135 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ad760963-34af-440e-9931-fbc23783d7cb" (UID: "ad760963-34af-440e-9931-fbc23783d7cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.293856 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data" (OuterVolumeSpecName: "config-data") pod "e8631247-bdcb-45ff-a17d-ac7e7ff81800" (UID: "e8631247-bdcb-45ff-a17d-ac7e7ff81800"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.375633 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8631247-bdcb-45ff-a17d-ac7e7ff81800-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.375655 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad760963-34af-440e-9931-fbc23783d7cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.439830 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.442566 4830 scope.go:117] "RemoveContainer" containerID="ae17ba4052b5b73e7f8747e0bbd64f898ebbc5356b7377e5822b10903adec77d" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.463433 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.486482 4830 scope.go:117] "RemoveContainer" containerID="e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.487216 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.515275 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.544029 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.557213 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.557528 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.558395 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.559011 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.565278 4830 scope.go:117] "RemoveContainer" containerID="7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.567032 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.568181 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579183 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data\") pod \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579473 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znhqh\" (UniqueName: \"kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh\") pod \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579531 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc 
kubenswrapper[4830]: I0318 18:25:05.579559 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579625 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs\") pod \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579672 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqz9t\" (UniqueName: \"kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579696 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle\") pod \"44872ddd-52a8-4ca8-a07e-f84111475b8f\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579749 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdwh9\" (UniqueName: 
\"kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9\") pod \"44872ddd-52a8-4ca8-a07e-f84111475b8f\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579828 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle\") pod \"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579858 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579883 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data\") pod \"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579921 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs\") pod \"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579961 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom\") pod \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.579993 
4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86k49\" (UniqueName: \"kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49\") pod \"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580021 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs\") pod \"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle\") pod \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\" (UID: \"48aa5450-29c8-47de-bb37-a7a6ffd441bc\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580063 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data\") pod \"44872ddd-52a8-4ca8-a07e-f84111475b8f\" (UID: \"44872ddd-52a8-4ca8-a07e-f84111475b8f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580086 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs\") pod 
\"b3ba738f-c556-4b36-a045-3516efdf886a\" (UID: \"b3ba738f-c556-4b36-a045-3516efdf886a\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.580144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle\") pod \"3e152864-9096-47a7-b0b0-c288840093e7\" (UID: \"3e152864-9096-47a7-b0b0-c288840093e7\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.582288 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs" (OuterVolumeSpecName: "logs") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.584410 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs" (OuterVolumeSpecName: "logs") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.592606 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.599100 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs" (OuterVolumeSpecName: "logs") pod "48aa5450-29c8-47de-bb37-a7a6ffd441bc" (UID: "48aa5450-29c8-47de-bb37-a7a6ffd441bc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.602912 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh" (OuterVolumeSpecName: "kube-api-access-znhqh") pod "48aa5450-29c8-47de-bb37-a7a6ffd441bc" (UID: "48aa5450-29c8-47de-bb37-a7a6ffd441bc"). InnerVolumeSpecName "kube-api-access-znhqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.605336 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.613734 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-03fa-account-create-update-fsvc6"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.614474 4830 scope.go:117] "RemoveContainer" containerID="e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6" Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.621663 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6\": container with ID starting with e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6 not found: ID does not exist" containerID="e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.621700 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6"} err="failed to get container status \"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6\": rpc error: code = NotFound desc = could not find container 
\"e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6\": container with ID starting with e3b1b0010366275a52abd6d5d86dee987f708c05b0176a216ab2bd64c79302b6 not found: ID does not exist" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.621725 4830 scope.go:117] "RemoveContainer" containerID="7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364" Mar 18 18:25:05 crc kubenswrapper[4830]: E0318 18:25:05.622074 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364\": container with ID starting with 7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364 not found: ID does not exist" containerID="7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.622091 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364"} err="failed to get container status \"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364\": rpc error: code = NotFound desc = could not find container \"7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364\": container with ID starting with 7ec14097d0f88bba4680e90148e2beece59c3678f735e4a8e1a973a7adfaf364 not found: ID does not exist" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.622102 4830 scope.go:117] "RemoveContainer" containerID="06b5da3aa085e9b3e11d65936e872fab74b18aa97d39f5db82fc225e3ce954b4" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.626575 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9" (OuterVolumeSpecName: "kube-api-access-jdwh9") pod "44872ddd-52a8-4ca8-a07e-f84111475b8f" (UID: "44872ddd-52a8-4ca8-a07e-f84111475b8f"). 
InnerVolumeSpecName "kube-api-access-jdwh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.626942 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.625825 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t" (OuterVolumeSpecName: "kube-api-access-xqz9t") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "kube-api-access-xqz9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.659826 4830 scope.go:117] "RemoveContainer" containerID="1f65787d2e3aac204498b2bda1b107a09472a1e7a4c737c2468ded43190d999e" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.667996 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-676956db6-6grw2"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.676928 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.677052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49" (OuterVolumeSpecName: "kube-api-access-86k49") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "kube-api-access-86k49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.677171 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48aa5450-29c8-47de-bb37-a7a6ffd441bc" (UID: "48aa5450-29c8-47de-bb37-a7a6ffd441bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685380 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle\") pod \"11e19037-abf1-4269-b933-0950913973b9\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config\") pod \"a0a1e291-1a11-4747-96ed-32c95623dcbb\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685533 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom\") pod \"11e19037-abf1-4269-b933-0950913973b9\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685559 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs\") pod \"11e19037-abf1-4269-b933-0950913973b9\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685665 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5jl6k\" (UniqueName: \"kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k\") pod \"77147fe4-670f-40ca-ab50-4d3220442eee\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685707 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle\") pod \"a0e71339-fd75-44b3-bbb8-15d75455d90f\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.685749 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data\") pod \"a0a1e291-1a11-4747-96ed-32c95623dcbb\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfnj6\" (UniqueName: \"kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6\") pod \"11e19037-abf1-4269-b933-0950913973b9\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs\") pod \"a0a1e291-1a11-4747-96ed-32c95623dcbb\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jxbg\" (UniqueName: \"kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg\") pod 
\"a0a1e291-1a11-4747-96ed-32c95623dcbb\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle\") pod \"a0a1e291-1a11-4747-96ed-32c95623dcbb\" (UID: \"a0a1e291-1a11-4747-96ed-32c95623dcbb\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686313 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts\") pod \"77147fe4-670f-40ca-ab50-4d3220442eee\" (UID: \"77147fe4-670f-40ca-ab50-4d3220442eee\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686362 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data\") pod \"a0e71339-fd75-44b3-bbb8-15d75455d90f\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686414 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data\") pod \"11e19037-abf1-4269-b933-0950913973b9\" (UID: \"11e19037-abf1-4269-b933-0950913973b9\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686465 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7ndk\" (UniqueName: \"kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk\") pod \"a0e71339-fd75-44b3-bbb8-15d75455d90f\" (UID: \"a0e71339-fd75-44b3-bbb8-15d75455d90f\") " Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.686519 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs" (OuterVolumeSpecName: "logs") pod "11e19037-abf1-4269-b933-0950913973b9" (UID: "11e19037-abf1-4269-b933-0950913973b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687412 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znhqh\" (UniqueName: \"kubernetes.io/projected/48aa5450-29c8-47de-bb37-a7a6ffd441bc-kube-api-access-znhqh\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687432 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687446 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa5450-29c8-47de-bb37-a7a6ffd441bc-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687455 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqz9t\" (UniqueName: \"kubernetes.io/projected/3e152864-9096-47a7-b0b0-c288840093e7-kube-api-access-xqz9t\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687465 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdwh9\" (UniqueName: \"kubernetes.io/projected/44872ddd-52a8-4ca8-a07e-f84111475b8f-kube-api-access-jdwh9\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687473 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ba738f-c556-4b36-a045-3516efdf886a-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687484 4830 reconciler_common.go:293] "Volume 
detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687495 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e19037-abf1-4269-b933-0950913973b9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687509 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86k49\" (UniqueName: \"kubernetes.io/projected/b3ba738f-c556-4b36-a045-3516efdf886a-kube-api-access-86k49\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.687520 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e152864-9096-47a7-b0b0-c288840093e7-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.688532 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77147fe4-670f-40ca-ab50-4d3220442eee" (UID: "77147fe4-670f-40ca-ab50-4d3220442eee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.710684 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a0a1e291-1a11-4747-96ed-32c95623dcbb" (UID: "a0a1e291-1a11-4747-96ed-32c95623dcbb"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.710718 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk" (OuterVolumeSpecName: "kube-api-access-g7ndk") pod "a0e71339-fd75-44b3-bbb8-15d75455d90f" (UID: "a0e71339-fd75-44b3-bbb8-15d75455d90f"). InnerVolumeSpecName "kube-api-access-g7ndk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.710856 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg" (OuterVolumeSpecName: "kube-api-access-7jxbg") pod "a0a1e291-1a11-4747-96ed-32c95623dcbb" (UID: "a0a1e291-1a11-4747-96ed-32c95623dcbb"). InnerVolumeSpecName "kube-api-access-7jxbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.711286 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data" (OuterVolumeSpecName: "config-data") pod "a0a1e291-1a11-4747-96ed-32c95623dcbb" (UID: "a0a1e291-1a11-4747-96ed-32c95623dcbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.713353 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.723249 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-df3e-account-create-update-jn48p"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.732822 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-df3e-account-create-update-jn48p"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.740085 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.749799 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11e19037-abf1-4269-b933-0950913973b9" (UID: "11e19037-abf1-4269-b933-0950913973b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.750117 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data" (OuterVolumeSpecName: "config-data") pod "44872ddd-52a8-4ca8-a07e-f84111475b8f" (UID: "44872ddd-52a8-4ca8-a07e-f84111475b8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.750265 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6" (OuterVolumeSpecName: "kube-api-access-tfnj6") pod "11e19037-abf1-4269-b933-0950913973b9" (UID: "11e19037-abf1-4269-b933-0950913973b9"). InnerVolumeSpecName "kube-api-access-tfnj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.751084 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.751432 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k" (OuterVolumeSpecName: "kube-api-access-5jl6k") pod "77147fe4-670f-40ca-ab50-4d3220442eee" (UID: "77147fe4-670f-40ca-ab50-4d3220442eee"). InnerVolumeSpecName "kube-api-access-5jl6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.768981 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.775012 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a2d6-account-create-update-ltf4b"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789328 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77147fe4-670f-40ca-ab50-4d3220442eee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789361 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7ndk\" (UniqueName: \"kubernetes.io/projected/a0e71339-fd75-44b3-bbb8-15d75455d90f-kube-api-access-g7ndk\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789375 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0586337e-461c-4367-8f0a-0bc3593732ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789386 4830 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789400 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789412 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789423 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jl6k\" (UniqueName: \"kubernetes.io/projected/77147fe4-670f-40ca-ab50-4d3220442eee-kube-api-access-5jl6k\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789434 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0a1e291-1a11-4747-96ed-32c95623dcbb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789448 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789459 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h58f\" (UniqueName: \"kubernetes.io/projected/0586337e-461c-4367-8f0a-0bc3593732ce-kube-api-access-8h58f\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789470 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfnj6\" (UniqueName: 
\"kubernetes.io/projected/11e19037-abf1-4269-b933-0950913973b9-kube-api-access-tfnj6\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.789481 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jxbg\" (UniqueName: \"kubernetes.io/projected/a0a1e291-1a11-4747-96ed-32c95623dcbb-kube-api-access-7jxbg\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.793940 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.806564 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.824429 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44872ddd-52a8-4ca8-a07e-f84111475b8f" (UID: "44872ddd-52a8-4ca8-a07e-f84111475b8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.827503 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.827814 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a1e291-1a11-4747-96ed-32c95623dcbb" (UID: "a0a1e291-1a11-4747-96ed-32c95623dcbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.827845 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.828078 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11e19037-abf1-4269-b933-0950913973b9" (UID: "11e19037-abf1-4269-b933-0950913973b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.849284 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.863982 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.865349 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.872304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48aa5450-29c8-47de-bb37-a7a6ffd441bc" (UID: "48aa5450-29c8-47de-bb37-a7a6ffd441bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.881628 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.881850 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data" (OuterVolumeSpecName: "config-data") pod "a0e71339-fd75-44b3-bbb8-15d75455d90f" (UID: "a0e71339-fd75-44b3-bbb8-15d75455d90f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891714 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891748 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891759 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891791 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44872ddd-52a8-4ca8-a07e-f84111475b8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891804 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891816 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.891827 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 
18:25:05.902849 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.903131 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data" (OuterVolumeSpecName: "config-data") pod "48aa5450-29c8-47de-bb37-a7a6ffd441bc" (UID: "48aa5450-29c8-47de-bb37-a7a6ffd441bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.910748 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lhdqd"] Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.917137 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data" (OuterVolumeSpecName: "config-data") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.930155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "a0a1e291-1a11-4747-96ed-32c95623dcbb" (UID: "a0a1e291-1a11-4747-96ed-32c95623dcbb"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.950505 4830 scope.go:117] "RemoveContainer" containerID="13a949ebe12567f356b288e72620234deec79f64d460b08c050f70b6131858f4" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.954589 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3ba738f-c556-4b36-a045-3516efdf886a" (UID: "b3ba738f-c556-4b36-a045-3516efdf886a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.960529 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0e71339-fd75-44b3-bbb8-15d75455d90f" (UID: "a0e71339-fd75-44b3-bbb8-15d75455d90f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.973522 4830 scope.go:117] "RemoveContainer" containerID="0746b8bb66f5e7517a5d7f696d7212e472acad426b45aa47e1826fd52a0611e5" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.977914 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.988096 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7b116575-f650-432e-9eb8-31b6f16b027c/ovn-northd/0.log" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.988157 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993001 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993907 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993923 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993932 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993942 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e71339-fd75-44b3-bbb8-15d75455d90f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: 
I0318 18:25:05.993950 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ba738f-c556-4b36-a045-3516efdf886a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993958 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa5450-29c8-47de-bb37-a7a6ffd441bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.993967 4830 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1e291-1a11-4747-96ed-32c95623dcbb-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:05 crc kubenswrapper[4830]: I0318 18:25:05.998992 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data" (OuterVolumeSpecName: "config-data") pod "3e152864-9096-47a7-b0b0-c288840093e7" (UID: "3e152864-9096-47a7-b0b0-c288840093e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.010302 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data" (OuterVolumeSpecName: "config-data") pod "11e19037-abf1-4269-b933-0950913973b9" (UID: "11e19037-abf1-4269-b933-0950913973b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096338 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz5mg\" (UniqueName: \"kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096421 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096447 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096477 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096502 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096591 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.096653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle\") pod \"7b116575-f650-432e-9eb8-31b6f16b027c\" (UID: \"7b116575-f650-432e-9eb8-31b6f16b027c\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.097405 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e152864-9096-47a7-b0b0-c288840093e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.097422 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e19037-abf1-4269-b933-0950913973b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.098010 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.100983 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg" (OuterVolumeSpecName: "kube-api-access-nz5mg") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "kube-api-access-nz5mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.113436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts" (OuterVolumeSpecName: "scripts") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.124040 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config" (OuterVolumeSpecName: "config") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.133913 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.212897 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: "7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213144 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213168 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213182 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz5mg\" (UniqueName: \"kubernetes.io/projected/7b116575-f650-432e-9eb8-31b6f16b027c-kube-api-access-nz5mg\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213195 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213207 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.213218 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b116575-f650-432e-9eb8-31b6f16b027c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.216315 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7b116575-f650-432e-9eb8-31b6f16b027c" (UID: 
"7b116575-f650-432e-9eb8-31b6f16b027c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.226281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"44872ddd-52a8-4ca8-a07e-f84111475b8f","Type":"ContainerDied","Data":"ac97e09895ec9d7d87458ffd0aeb3d5d9139d115aea8746f3981aee77198c5a4"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.226451 4830 scope.go:117] "RemoveContainer" containerID="d65ffc2b335a667737c6a18c2b396b9a709039acd32a58d2211316eb8df8aa6d" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.226358 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.233214 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a0a1e291-1a11-4747-96ed-32c95623dcbb","Type":"ContainerDied","Data":"d805b73651ccf98e97dab5ad8973d11d3a50eafb0529910eecf3d0ed2bdefb03"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.233323 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.246066 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.250423 4830 generic.go:334] "Generic (PLEG): container finished" podID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerID="1faa9ae26dd3b098b13664816e21acb92be7c62408cd3cf5567216f95dc7ad27" exitCode=0 Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.253800 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7b116575-f650-432e-9eb8-31b6f16b027c/ovn-northd/0.log" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.253839 4830 generic.go:334] "Generic (PLEG): container finished" podID="7b116575-f650-432e-9eb8-31b6f16b027c" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" exitCode=139 Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.255709 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-clvpz" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.255720 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.255839 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.261580 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bbb58d4c-74p8g" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.263234 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0586337e-461c-4367-8f0a-0bc3593732ce" path="/var/lib/kubelet/pods/0586337e-461c-4367-8f0a-0bc3593732ce/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.263847 4830 scope.go:117] "RemoveContainer" containerID="c6c30f91c3c07f2417a561616bc4ab4ba1863961710fa17a2a7105a6e4af19cd" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.264014 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" path="/var/lib/kubelet/pods/0ac8a4f8-88e7-4cd0-ab89-210fb088b137/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.264688 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8e20bd-67c1-48a7-be43-c585d65656ea" path="/var/lib/kubelet/pods/1e8e20bd-67c1-48a7-be43-c585d65656ea/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.265604 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e810512-a127-40b3-b1c2-559c3b86fcdb" path="/var/lib/kubelet/pods/3e810512-a127-40b3-b1c2-559c3b86fcdb/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.268154 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4120b308-df6b-45df-ab90-abc5417228e5" path="/var/lib/kubelet/pods/4120b308-df6b-45df-ab90-abc5417228e5/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.269578 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435574fa-a924-4289-a93a-dea05d57d105" path="/var/lib/kubelet/pods/435574fa-a924-4289-a93a-dea05d57d105/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.270619 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.270626 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f884dc87d-6wvs2" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.271970 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" path="/var/lib/kubelet/pods/9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.273668 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3421452-ceb9-441f-8982-77c0a33c7a3b" path="/var/lib/kubelet/pods/a3421452-ceb9-441f-8982-77c0a33c7a3b/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.274152 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd" path="/var/lib/kubelet/pods/a4a0587e-8ede-4ec6-beb7-7bea2c0af8bd/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.274698 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad760963-34af-440e-9931-fbc23783d7cb" path="/var/lib/kubelet/pods/ad760963-34af-440e-9931-fbc23783d7cb/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.275554 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" path="/var/lib/kubelet/pods/dd0fbdd2-a99b-4758-9f27-1f5055ca0172/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.277213 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a34a0e-8390-4618-8b6e-c27ed8adc51a" path="/var/lib/kubelet/pods/e3a34a0e-8390-4618-8b6e-c27ed8adc51a/volumes" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.298320 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" path="/var/lib/kubelet/pods/e8631247-bdcb-45ff-a17d-ac7e7ff81800/volumes" Mar 18 
18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.307723 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0e71339-fd75-44b3-bbb8-15d75455d90f","Type":"ContainerDied","Data":"fd2d24062fbfa06740b59dda44258239efc1b473de2f340be4737096134f19b3"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.311959 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f6989b54-vkxh8" event={"ID":"48aa5450-29c8-47de-bb37-a7a6ffd441bc","Type":"ContainerDied","Data":"56e7a13896b3a18f5d1fbe2b3c404e40bf121b90f6ede29fdbfac10fc578905d"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerDied","Data":"1faa9ae26dd3b098b13664816e21acb92be7c62408cd3cf5567216f95dc7ad27"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-clvpz" event={"ID":"77147fe4-670f-40ca-ab50-4d3220442eee","Type":"ContainerDied","Data":"b025bd585a36a924f268852de9a6e2b8372995c5905c3efd0f80f7bb05716102"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312049 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312068 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerDied","Data":"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312099 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"7b116575-f650-432e-9eb8-31b6f16b027c","Type":"ContainerDied","Data":"38175a96a48085f7db7e366f32d5fbfb42fa11538c532e62066eb897a627791b"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bbb58d4c-74p8g" event={"ID":"11e19037-abf1-4269-b933-0950913973b9","Type":"ContainerDied","Data":"7d5a15b177dc6f2188c753b993e032ffbf4e4b88ccdcdc22f26dcd0b1a630d90"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.312130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f884dc87d-6wvs2" event={"ID":"3e152864-9096-47a7-b0b0-c288840093e7","Type":"ContainerDied","Data":"540de3175421e90b719f9dae4a86fc275fed2e759d940020ba95c8d739092d80"} Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.314667 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b116575-f650-432e-9eb8-31b6f16b027c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.320477 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.323689 4830 scope.go:117] "RemoveContainer" containerID="b5f8b7f66219fddf66e22ef6b5a06dba84482b8f68cbbeea50a396ebe1d339d0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.328171 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.384487 4830 scope.go:117] "RemoveContainer" containerID="fc53817ebacc0ce8c203daf49d972d55c5cc1843b058744c9a909e3088e8e2dc" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.405095 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.411309 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-7f884dc87d-6wvs2"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.411410 4830 scope.go:117] "RemoveContainer" containerID="164f985d7fecf295460783b0211bbc6afa41549a232c6b8704e09de623fb3cd3" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.444620 4830 scope.go:117] "RemoveContainer" containerID="1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.476750 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.494746 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6bbb58d4c-74p8g"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.511174 4830 scope.go:117] "RemoveContainer" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.524336 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.537980 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.546781 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.547563 4830 scope.go:117] "RemoveContainer" containerID="1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78" Mar 18 18:25:06 crc kubenswrapper[4830]: E0318 18:25:06.550552 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78\": container with ID starting with 1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78 not found: ID does not exist" containerID="1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.550586 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78"} err="failed to get container status \"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78\": rpc error: code = NotFound desc = could not find container \"1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78\": container with ID starting with 1cae5bbd9865bbf63fae7e180aeb6c01f50309bbfd9244d4790218d40ab51f78 not found: ID does not exist" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.550606 4830 scope.go:117] "RemoveContainer" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.552328 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:25:06 crc kubenswrapper[4830]: E0318 18:25:06.552672 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da\": container with ID starting with 
5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da not found: ID does not exist" containerID="5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.552698 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da"} err="failed to get container status \"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da\": rpc error: code = NotFound desc = could not find container \"5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da\": container with ID starting with 5dd7c3004c5f8608ed4722eddd8ec0a5d064fff0cca450e65c3c344caa64b4da not found: ID does not exist" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.552715 4830 scope.go:117] "RemoveContainer" containerID="e3cd2ffc35cea964dcec2e27b4b151f289beecdcd0e3b5f7b932d52f599b93c0" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.566917 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-78f6989b54-vkxh8"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.581235 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-clvpz"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.583993 4830 scope.go:117] "RemoveContainer" containerID="904ded3c9841d4d431c9a8b7917b3f2eec10c31241a56280fbcc48164d2a5323" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.592562 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-clvpz"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.603240 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.604693 4830 scope.go:117] "RemoveContainer" containerID="e20014a42907afd388ba14b211a6c05885fe859da4a4d5b322dfc735c19c8637" Mar 18 18:25:06 crc 
kubenswrapper[4830]: I0318 18:25:06.611818 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.618050 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620553 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620614 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9cp\" (UniqueName: \"kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620637 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620660 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620681 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" 
(UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.620760 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated\") pod \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\" (UID: \"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15\") " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.621529 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.623590 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.623900 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.624266 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.626063 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.626747 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp" (OuterVolumeSpecName: "kube-api-access-mv9cp") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "kube-api-access-mv9cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.630060 4830 scope.go:117] "RemoveContainer" containerID="b03c2437bc3b020985e30c6a140c3c922aeb2a95cd6d3bf3c72a87ffaf8ce7ba" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.634911 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.643293 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.676854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" (UID: "34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.722649 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9cp\" (UniqueName: \"kubernetes.io/projected/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kube-api-access-mv9cp\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.722801 4830 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.722868 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.722921 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.722972 4830 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.723024 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.723099 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:06 crc 
kubenswrapper[4830]: I0318 18:25:06.723170 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.733027 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.213:3000/\": dial tcp 10.217.0.213:3000: connect: connection refused" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.741861 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 18:25:06 crc kubenswrapper[4830]: I0318 18:25:06.824967 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.219005 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f6ff8b5bf-p5xgc" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.302278 4830 generic.go:334] "Generic (PLEG): container finished" podID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" containerID="9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7" exitCode=0 Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.302588 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f6ff8b5bf-p5xgc" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.302489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f6ff8b5bf-p5xgc" event={"ID":"4ce021de-a1a0-43a6-a2fa-270ea1238bac","Type":"ContainerDied","Data":"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7"} Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.302718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f6ff8b5bf-p5xgc" event={"ID":"4ce021de-a1a0-43a6-a2fa-270ea1238bac","Type":"ContainerDied","Data":"110a5ec4f5768120e797dad14d9e9cdc7b4dca85aab727001f861f2b68696081"} Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.302749 4830 scope.go:117] "RemoveContainer" containerID="9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.313469 4830 generic.go:334] "Generic (PLEG): container finished" podID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerID="dae4cab83feb5262c8c7a5b8b0cb453b9f964431009385de80e3e0c21a526b8f" exitCode=0 Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.313522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerDied","Data":"dae4cab83feb5262c8c7a5b8b0cb453b9f964431009385de80e3e0c21a526b8f"} Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.315499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15","Type":"ContainerDied","Data":"327b00fdc804b36736db73fc8c691c17804ae744740ed660740eaa0606739c62"} Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.315570 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331330 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331492 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331602 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331658 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331737 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.331807 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpfmg\" (UniqueName: \"kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg\") pod \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\" (UID: \"4ce021de-a1a0-43a6-a2fa-270ea1238bac\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.341628 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg" (OuterVolumeSpecName: "kube-api-access-kpfmg") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "kube-api-access-kpfmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.341982 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts" (OuterVolumeSpecName: "scripts") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.342920 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.343375 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.348924 4830 scope.go:117] "RemoveContainer" containerID="9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7" Mar 18 18:25:07 crc kubenswrapper[4830]: E0318 18:25:07.350251 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7\": container with ID starting with 9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7 not found: ID does not exist" containerID="9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.350283 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7"} err="failed to get container status \"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7\": rpc error: code = NotFound desc = could not find container 
\"9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7\": container with ID starting with 9ac24ed62afef232745223270fdd95256063c89724522a58a2fc1a5183dbf7a7 not found: ID does not exist" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.350306 4830 scope.go:117] "RemoveContainer" containerID="1faa9ae26dd3b098b13664816e21acb92be7c62408cd3cf5567216f95dc7ad27" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.366420 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.369321 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data" (OuterVolumeSpecName: "config-data") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.385131 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.397323 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4ce021de-a1a0-43a6-a2fa-270ea1238bac" (UID: "4ce021de-a1a0-43a6-a2fa-270ea1238bac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.413777 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.418608 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:25:07 crc kubenswrapper[4830]: E0318 18:25:07.424847 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fb6c83_b748_4e21_9b1c_90fb37cefea1.slice/crio-conmon-27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fb6c83_b748_4e21_9b1c_90fb37cefea1.slice/crio-27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:25:07 crc kubenswrapper[4830]: E0318 18:25:07.435647 4830 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:25:07 crc kubenswrapper[4830]: E0318 18:25:07.435727 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data podName:56fb6c83-b748-4e21-9b1c-90fb37cefea1 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:25:15.435709972 +0000 UTC m=+1350.003340294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data") pod "rabbitmq-server-0" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1") : configmap "rabbitmq-config-data" not found Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435671 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435814 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435825 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435835 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435843 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435853 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: 
I0318 18:25:07.435861 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ce021de-a1a0-43a6-a2fa-270ea1238bac-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.435872 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpfmg\" (UniqueName: \"kubernetes.io/projected/4ce021de-a1a0-43a6-a2fa-270ea1238bac-kube-api-access-kpfmg\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.447934 4830 scope.go:117] "RemoveContainer" containerID="e20da5e2c4bbdc2778d58c4baf6547e914e5cc2ec137efafe2f7cd9631a76c14" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.569485 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.646916 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"] Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.648943 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6f6ff8b5bf-p5xgc"] Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743298 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743323 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743375 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743405 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743471 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbgzn\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743488 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 
18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743557 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743929 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret\") pod \"a639262d-5bc7-4b14-a6ef-59583fdffb07\" (UID: \"a639262d-5bc7-4b14-a6ef-59583fdffb07\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.743808 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.744128 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.744255 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.744273 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.744364 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.749035 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn" (OuterVolumeSpecName: "kube-api-access-mbgzn") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "kube-api-access-mbgzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.749401 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.750362 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.755929 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.762059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info" (OuterVolumeSpecName: "pod-info") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.781226 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data" (OuterVolumeSpecName: "config-data") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.786053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf" (OuterVolumeSpecName: "server-conf") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.816962 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a639262d-5bc7-4b14-a6ef-59583fdffb07" (UID: "a639262d-5bc7-4b14-a6ef-59583fdffb07"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.845579 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846444 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbgzn\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-kube-api-access-mbgzn\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846502 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846519 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a639262d-5bc7-4b14-a6ef-59583fdffb07-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846529 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846538 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846548 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a639262d-5bc7-4b14-a6ef-59583fdffb07-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846556 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 
18:25:07.846565 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a639262d-5bc7-4b14-a6ef-59583fdffb07-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.846573 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a639262d-5bc7-4b14-a6ef-59583fdffb07-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.862277 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.947938 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.947993 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948087 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc9jv\" (UniqueName: 
\"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948117 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948150 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948193 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948217 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948314 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls\") pod \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\" (UID: \"56fb6c83-b748-4e21-9b1c-90fb37cefea1\") " Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.948599 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.949195 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.949374 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.949545 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.951985 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.953125 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info" (OuterVolumeSpecName: "pod-info") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.953507 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.954531 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.955346 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv" (OuterVolumeSpecName: "kube-api-access-cc9jv") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "kube-api-access-cc9jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:07 crc kubenswrapper[4830]: I0318 18:25:07.965515 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data" (OuterVolumeSpecName: "config-data") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.009877 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf" (OuterVolumeSpecName: "server-conf") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.048529 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56fb6c83-b748-4e21-9b1c-90fb37cefea1" (UID: "56fb6c83-b748-4e21-9b1c-90fb37cefea1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.049909 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56fb6c83-b748-4e21-9b1c-90fb37cefea1-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050001 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050025 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050048 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc9jv\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-kube-api-access-cc9jv\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050100 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050120 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050139 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56fb6c83-b748-4e21-9b1c-90fb37cefea1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050159 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050178 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050197 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56fb6c83-b748-4e21-9b1c-90fb37cefea1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.050216 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56fb6c83-b748-4e21-9b1c-90fb37cefea1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.066424 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.152587 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 
18:25:08.247538 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e19037-abf1-4269-b933-0950913973b9" path="/var/lib/kubelet/pods/11e19037-abf1-4269-b933-0950913973b9/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.248726 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" path="/var/lib/kubelet/pods/34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.249673 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e152864-9096-47a7-b0b0-c288840093e7" path="/var/lib/kubelet/pods/3e152864-9096-47a7-b0b0-c288840093e7/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.251280 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" path="/var/lib/kubelet/pods/44872ddd-52a8-4ca8-a07e-f84111475b8f/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.252202 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" path="/var/lib/kubelet/pods/48aa5450-29c8-47de-bb37-a7a6ffd441bc/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.253053 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" path="/var/lib/kubelet/pods/4ce021de-a1a0-43a6-a2fa-270ea1238bac/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.254579 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77147fe4-670f-40ca-ab50-4d3220442eee" path="/var/lib/kubelet/pods/77147fe4-670f-40ca-ab50-4d3220442eee/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.255143 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" path="/var/lib/kubelet/pods/7b116575-f650-432e-9eb8-31b6f16b027c/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 
18:25:08.255889 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a1e291-1a11-4747-96ed-32c95623dcbb" path="/var/lib/kubelet/pods/a0a1e291-1a11-4747-96ed-32c95623dcbb/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.257128 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e71339-fd75-44b3-bbb8-15d75455d90f" path="/var/lib/kubelet/pods/a0e71339-fd75-44b3-bbb8-15d75455d90f/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.258951 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" path="/var/lib/kubelet/pods/b3ba738f-c556-4b36-a045-3516efdf886a/volumes" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.326219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a639262d-5bc7-4b14-a6ef-59583fdffb07","Type":"ContainerDied","Data":"4cfa4a0484891280bed69c79151db54e20b954f679f73ed24fb22fdc6733635d"} Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.326494 4830 scope.go:117] "RemoveContainer" containerID="dae4cab83feb5262c8c7a5b8b0cb453b9f964431009385de80e3e0c21a526b8f" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.326630 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.328131 4830 generic.go:334] "Generic (PLEG): container finished" podID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerID="27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d" exitCode=0 Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.329207 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.328731 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerDied","Data":"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d"} Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.329449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56fb6c83-b748-4e21-9b1c-90fb37cefea1","Type":"ContainerDied","Data":"81bc88812550989ad89124dc8826a79b66e9b1a3b524cf17759ea64751502fc0"} Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.353654 4830 scope.go:117] "RemoveContainer" containerID="4b3823ab703387f205ec3b36349fb621c98a8c89a6e4303228224586840c10d9" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.354007 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.361045 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.371500 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.376158 4830 scope.go:117] "RemoveContainer" containerID="27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.380439 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.406977 4830 scope.go:117] "RemoveContainer" containerID="15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.430798 4830 scope.go:117] "RemoveContainer" 
containerID="27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d" Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.431310 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d\": container with ID starting with 27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d not found: ID does not exist" containerID="27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.431403 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d"} err="failed to get container status \"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d\": rpc error: code = NotFound desc = could not find container \"27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d\": container with ID starting with 27c872d698c8ee18fd3cec86b3f4b99ede08b94d9fdc72c7ec17bce05d4a979d not found: ID does not exist" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.431484 4830 scope.go:117] "RemoveContainer" containerID="15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333" Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.432128 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333\": container with ID starting with 15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333 not found: ID does not exist" containerID="15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333" Mar 18 18:25:08 crc kubenswrapper[4830]: I0318 18:25:08.432216 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333"} err="failed to get container status \"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333\": rpc error: code = NotFound desc = could not find container \"15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333\": container with ID starting with 15382a1e088fc1fb63a24c483cd644ccab35ae8fb871c5907d4ff1a797361333 not found: ID does not exist" Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.775104 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.776086 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.776585 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.776699 4830 prober.go:104] "Probe errored" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.777210 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.778925 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.780298 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:08 crc kubenswrapper[4830]: E0318 18:25:08.780399 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.861543 4830 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979635 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979700 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kh5\" (UniqueName: \"kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979747 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979850 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979885 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.979946 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml\") pod \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\" (UID: \"eaec193f-d7b0-4d62-8133-3c1b094a1c71\") " Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.980047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.980301 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.980521 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.984428 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5" (OuterVolumeSpecName: "kube-api-access-b4kh5") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "kube-api-access-b4kh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:09 crc kubenswrapper[4830]: I0318 18:25:09.984871 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts" (OuterVolumeSpecName: "scripts") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.012609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.038806 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.044554 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082141 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4kh5\" (UniqueName: \"kubernetes.io/projected/eaec193f-d7b0-4d62-8133-3c1b094a1c71-kube-api-access-b4kh5\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082178 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaec193f-d7b0-4d62-8133-3c1b094a1c71-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082193 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082204 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082216 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.082229 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.085555 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data" (OuterVolumeSpecName: "config-data") pod "eaec193f-d7b0-4d62-8133-3c1b094a1c71" (UID: "eaec193f-d7b0-4d62-8133-3c1b094a1c71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.185604 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaec193f-d7b0-4d62-8133-3c1b094a1c71-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.255276 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" path="/var/lib/kubelet/pods/56fb6c83-b748-4e21-9b1c-90fb37cefea1/volumes" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.256822 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" path="/var/lib/kubelet/pods/a639262d-5bc7-4b14-a6ef-59583fdffb07/volumes" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.372724 4830 generic.go:334] "Generic (PLEG): container finished" podID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerID="2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec" exitCode=0 Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.372782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerDied","Data":"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec"} Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.372811 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eaec193f-d7b0-4d62-8133-3c1b094a1c71","Type":"ContainerDied","Data":"a63310a9928b07dd1360ac3b6c497432a25106199f7212391886bee0dcf8cbb8"} Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.372831 4830 scope.go:117] "RemoveContainer" containerID="63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.372856 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.400751 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.405373 4830 scope.go:117] "RemoveContainer" containerID="005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.407350 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.433565 4830 scope.go:117] "RemoveContainer" containerID="2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.454201 4830 scope.go:117] "RemoveContainer" containerID="65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.478135 4830 scope.go:117] "RemoveContainer" containerID="63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9" Mar 18 18:25:10 crc kubenswrapper[4830]: E0318 18:25:10.478578 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9\": container with ID starting with 63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9 not found: ID does not exist" containerID="63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9" Mar 18 18:25:10 crc 
kubenswrapper[4830]: I0318 18:25:10.478648 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9"} err="failed to get container status \"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9\": rpc error: code = NotFound desc = could not find container \"63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9\": container with ID starting with 63b37854cf719feb1c02ec413574066ba4e4851ec5e2dcf31206e9b303fe11b9 not found: ID does not exist" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.478682 4830 scope.go:117] "RemoveContainer" containerID="005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba" Mar 18 18:25:10 crc kubenswrapper[4830]: E0318 18:25:10.479197 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba\": container with ID starting with 005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba not found: ID does not exist" containerID="005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.479280 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba"} err="failed to get container status \"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba\": rpc error: code = NotFound desc = could not find container \"005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba\": container with ID starting with 005913a3f36b52570d077d6af5c36588b42eb3f96b9355948ef0a743de24a6ba not found: ID does not exist" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.479335 4830 scope.go:117] "RemoveContainer" containerID="2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec" Mar 18 
18:25:10 crc kubenswrapper[4830]: E0318 18:25:10.479646 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec\": container with ID starting with 2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec not found: ID does not exist" containerID="2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.479681 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec"} err="failed to get container status \"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec\": rpc error: code = NotFound desc = could not find container \"2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec\": container with ID starting with 2e586d789cc93d7b5024e68fdd566bb1e63ba1b1e2a073f9da5ce7c5613a1dec not found: ID does not exist" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.479701 4830 scope.go:117] "RemoveContainer" containerID="65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3" Mar 18 18:25:10 crc kubenswrapper[4830]: E0318 18:25:10.480148 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3\": container with ID starting with 65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3 not found: ID does not exist" containerID="65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3" Mar 18 18:25:10 crc kubenswrapper[4830]: I0318 18:25:10.480182 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3"} err="failed to get container status 
\"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3\": rpc error: code = NotFound desc = could not find container \"65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3\": container with ID starting with 65066b3a4fe0d4c7187ffc8f87f73fa9a33a31d62b060e41f62c950b0fe762f3 not found: ID does not exist" Mar 18 18:25:11 crc kubenswrapper[4830]: I0318 18:25:11.387966 4830 generic.go:334] "Generic (PLEG): container finished" podID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerID="22ea3fa0cc5c2b7b61047286d5c724a062a16f2a3599d4207776fd36457bdcd2" exitCode=0 Mar 18 18:25:11 crc kubenswrapper[4830]: I0318 18:25:11.388041 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerDied","Data":"22ea3fa0cc5c2b7b61047286d5c724a062a16f2a3599d4207776fd36457bdcd2"} Mar 18 18:25:11 crc kubenswrapper[4830]: I0318 18:25:11.867278 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85cbc86c69-bkfst" Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.014090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbgnp\" (UniqueName: \"kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.014439 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.014576 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.014744 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.015031 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.015205 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.015350 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") " Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.037954 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.043791 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp" (OuterVolumeSpecName: "kube-api-access-gbgnp") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "kube-api-access-gbgnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.075251 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.096051 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.100927 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.103517 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.116595 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config" (OuterVolumeSpecName: "config") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117074 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") pod \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\" (UID: \"e184a0dc-c2fa-4cc2-9785-18a056ab0c46\") "
Mar 18 18:25:12 crc kubenswrapper[4830]: W0318 18:25:12.117243 4830 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e184a0dc-c2fa-4cc2-9785-18a056ab0c46/volumes/kubernetes.io~secret/config
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117269 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config" (OuterVolumeSpecName: "config") pod "e184a0dc-c2fa-4cc2-9785-18a056ab0c46" (UID: "e184a0dc-c2fa-4cc2-9785-18a056ab0c46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117419 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117438 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117450 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbgnp\" (UniqueName: \"kubernetes.io/projected/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-kube-api-access-gbgnp\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117460 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117469 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117477 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.117485 4830 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e184a0dc-c2fa-4cc2-9785-18a056ab0c46-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.243712 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" path="/var/lib/kubelet/pods/eaec193f-d7b0-4d62-8133-3c1b094a1c71/volumes"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.405290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85cbc86c69-bkfst" event={"ID":"e184a0dc-c2fa-4cc2-9785-18a056ab0c46","Type":"ContainerDied","Data":"550094753c0c72a83f4a09c9aab80db49d41471f4c5ea0cc699a845e44d93dde"}
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.405638 4830 scope.go:117] "RemoveContainer" containerID="2cdcb9ee439266520f74d448b0617ce7209026290de151d3b384a0c54cc23c3f"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.405869 4830 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-85cbc86c69-bkfst"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.430220 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85cbc86c69-bkfst"]
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.442038 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-85cbc86c69-bkfst"]
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.443643 4830 scope.go:117] "RemoveContainer" containerID="22ea3fa0cc5c2b7b61047286d5c724a062a16f2a3599d4207776fd36457bdcd2"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.618636 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619000 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-metadata"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619019 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-metadata"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619030 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="mysql-bootstrap"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619039 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="mysql-bootstrap"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619052 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619059 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619081 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619088 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619097 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619103 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619111 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="proxy-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619116 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="proxy-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619131 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619140 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619152 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a1e291-1a11-4747-96ed-32c95623dcbb" containerName="memcached"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619159 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a1e291-1a11-4747-96ed-32c95623dcbb" containerName="memcached"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619169 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619176 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619191 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="openstack-network-exporter"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619199 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="openstack-network-exporter"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619211 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619218 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619230 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619235 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619245 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619251 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619261 4830
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8e20bd-67c1-48a7-be43-c585d65656ea" containerName="nova-scheduler-scheduler"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619267 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8e20bd-67c1-48a7-be43-c585d65656ea" containerName="nova-scheduler-scheduler"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619275 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="setup-container"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619280 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="setup-container"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619290 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619304 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619317 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619324 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619333 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="galera"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619339 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="galera"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619348 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerName="nova-cell1-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619355 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerName="nova-cell1-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619368 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="ovn-northd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619377 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="ovn-northd"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619391 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-central-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619398 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-central-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619409 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619417 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619434 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619444 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619455 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619463 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619473 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619480 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619506 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="setup-container"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619513 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="setup-container"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619522 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619530 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619544 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619551 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619564 4830 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619571 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619585 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619591 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619603 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e71339-fd75-44b3-bbb8-15d75455d90f" containerName="nova-cell0-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619610 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e71339-fd75-44b3-bbb8-15d75455d90f" containerName="nova-cell0-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619622 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619629 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619651 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619659 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619672 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" containerName="keystone-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619690 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" containerName="keystone-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619700 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-notification-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619718 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-notification-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619730 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="sg-core"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619737 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="sg-core"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619839 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619848 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: E0318 18:25:12.619881 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e810512-a127-40b3-b1c2-559c3b86fcdb" containerName="kube-state-metrics"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.619890 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e810512-a127-40b3-b1c2-559c3b86fcdb" containerName="kube-state-metrics"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620087 4830
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620114 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dba61e-bb8b-4d99-9a3c-7d9c5dd4ad15" containerName="galera"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620127 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a1e291-1a11-4747-96ed-32c95623dcbb" containerName="memcached"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620136 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620147 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620156 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e810512-a127-40b3-b1c2-559c3b86fcdb" containerName="kube-state-metrics"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620165 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620175 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e71339-fd75-44b3-bbb8-15d75455d90f" containerName="nova-cell0-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620185 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce021de-a1a0-43a6-a2fa-270ea1238bac" containerName="keystone-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620192 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="proxy-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620203 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46" containerName="neutron-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620212 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="sg-core"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620223 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620231 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8e20bd-67c1-48a7-be43-c585d65656ea" containerName="nova-scheduler-scheduler"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620247 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620258 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e152864-9096-47a7-b0b0-c288840093e7" containerName="barbican-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620265 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620278 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8631247-bdcb-45ff-a17d-ac7e7ff81800" containerName="glance-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620286 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-notification-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620294 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620304 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620316 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44872ddd-52a8-4ca8-a07e-f84111475b8f" containerName="nova-cell1-conductor-conductor"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620340 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fb6c83-b748-4e21-9b1c-90fb37cefea1" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620349 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad760963-34af-440e-9931-fbc23783d7cb" containerName="placement-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620358 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620367 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0fbdd2-a99b-4758-9f27-1f5055ca0172" containerName="nova-metadata-metadata"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620375 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a639262d-5bc7-4b14-a6ef-59583fdffb07" containerName="rabbitmq"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620386 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="ovn-northd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620396 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ba738f-c556-4b36-a045-3516efdf886a" containerName="nova-api-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620404 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b116575-f650-432e-9eb8-31b6f16b027c" containerName="openstack-network-exporter"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620412 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaec193f-d7b0-4d62-8133-3c1b094a1c71" containerName="ceilometer-central-agent"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620425 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e19037-abf1-4269-b933-0950913973b9" containerName="barbican-worker"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620438 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-httpd"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620450 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c081c8d-74a0-4cd7-b5f5-d3ed585ef80b" containerName="cinder-api"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.620462 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="48aa5450-29c8-47de-bb37-a7a6ffd441bc" containerName="barbican-keystone-listener-log"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.621586 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.627211 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.725455 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.725537 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmtv\" (UniqueName: \"kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.725631 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.827114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.827231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.827276 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmtv\" (UniqueName: \"kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.827689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.827703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.847853 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmtv\" (UniqueName: \"kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv\") pod \"redhat-operators-gwsh9\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") " pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:12 crc kubenswrapper[4830]: I0318 18:25:12.944918 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:13 crc kubenswrapper[4830]: I0318 18:25:13.403081 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.776492 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.776890 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.776991 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.778168 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.778241 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server"
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.785039 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.786391 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:25:13 crc kubenswrapper[4830]: E0318 18:25:13.786453 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd"
Mar 18 18:25:14 crc kubenswrapper[4830]: I0318 18:25:14.247188 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e184a0dc-c2fa-4cc2-9785-18a056ab0c46"
path="/var/lib/kubelet/pods/e184a0dc-c2fa-4cc2-9785-18a056ab0c46/volumes" Mar 18 18:25:14 crc kubenswrapper[4830]: I0318 18:25:14.425120 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6c71011-0d91-4dbe-8520-43816df80374" containerID="8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c" exitCode=0 Mar 18 18:25:14 crc kubenswrapper[4830]: I0318 18:25:14.425181 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerDied","Data":"8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c"} Mar 18 18:25:14 crc kubenswrapper[4830]: I0318 18:25:14.425215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerStarted","Data":"c36dbc7e4c11f90af3de7069f42c906c461e17b44b5ae972b255bcf8c3e038f8"} Mar 18 18:25:14 crc kubenswrapper[4830]: I0318 18:25:14.427631 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:25:15 crc kubenswrapper[4830]: I0318 18:25:15.445579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerStarted","Data":"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"} Mar 18 18:25:16 crc kubenswrapper[4830]: I0318 18:25:16.455954 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6c71011-0d91-4dbe-8520-43816df80374" containerID="f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27" exitCode=0 Mar 18 18:25:16 crc kubenswrapper[4830]: I0318 18:25:16.456007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" 
event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerDied","Data":"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"} Mar 18 18:25:18 crc kubenswrapper[4830]: I0318 18:25:18.478143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerStarted","Data":"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"} Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.775844 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.776449 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.777071 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.777108 4830 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.779631 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.781721 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.783601 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:18 crc kubenswrapper[4830]: E0318 18:25:18.783716 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd" Mar 18 18:25:22 crc kubenswrapper[4830]: I0318 18:25:22.945447 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gwsh9" Mar 18 18:25:22 crc kubenswrapper[4830]: I0318 18:25:22.945835 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gwsh9" Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.775859 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.776319 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.776750 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.776821 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is 
running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.778175 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.780465 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.782316 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:23 crc kubenswrapper[4830]: E0318 18:25:23.782386 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd" Mar 18 18:25:23 crc kubenswrapper[4830]: I0318 18:25:23.990371 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gwsh9" podUID="f6c71011-0d91-4dbe-8520-43816df80374" 
containerName="registry-server" probeResult="failure" output=< Mar 18 18:25:23 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 18:25:23 crc kubenswrapper[4830]: > Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.775820 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.777728 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.778411 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.779072 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.779135 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server" Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.780476 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.782447 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:25:28 crc kubenswrapper[4830]: E0318 18:25:28.782552 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dv8kn" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd" Mar 18 18:25:29 crc kubenswrapper[4830]: I0318 18:25:29.509528 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:25:29 crc kubenswrapper[4830]: I0318 18:25:29.509844 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:25:29 crc kubenswrapper[4830]: I0318 18:25:29.587313 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dv8kn_23b737c7-6b5d-44f4-b05a-de278f4ca572/ovs-vswitchd/0.log" Mar 18 18:25:29 crc kubenswrapper[4830]: I0318 18:25:29.587960 4830 generic.go:334] "Generic (PLEG): container finished" podID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" exitCode=137 Mar 18 18:25:29 crc kubenswrapper[4830]: I0318 18:25:29.587997 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerDied","Data":"880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50"} Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.028614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dv8kn_23b737c7-6b5d-44f4-b05a-de278f4ca572/ovs-vswitchd/0.log" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.029253 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.050124 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gwsh9" podStartSLOduration=15.144614676 podStartE2EDuration="18.050105203s" podCreationTimestamp="2026-03-18 18:25:12 +0000 UTC" firstStartedPulling="2026-03-18 18:25:14.427257176 +0000 UTC m=+1348.994887538" lastFinishedPulling="2026-03-18 18:25:17.332747723 +0000 UTC m=+1351.900378065" observedRunningTime="2026-03-18 18:25:18.499420902 +0000 UTC m=+1353.067051254" watchObservedRunningTime="2026-03-18 18:25:30.050105203 +0000 UTC m=+1364.617735535" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119004 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs\") pod \"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119057 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run\") pod \"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119104 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log\") pod \"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119131 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib\") pod 
\"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119136 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119171 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run" (OuterVolumeSpecName: "var-run") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119216 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c44jd\" (UniqueName: \"kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd\") pod \"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119215 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log" (OuterVolumeSpecName: "var-log") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib" (OuterVolumeSpecName: "var-lib") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119251 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts\") pod \"23b737c7-6b5d-44f4-b05a-de278f4ca572\" (UID: \"23b737c7-6b5d-44f4-b05a-de278f4ca572\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119594 4830 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119611 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119623 4830 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.119634 4830 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23b737c7-6b5d-44f4-b05a-de278f4ca572-var-lib\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.120329 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts" (OuterVolumeSpecName: "scripts") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.134166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd" (OuterVolumeSpecName: "kube-api-access-c44jd") pod "23b737c7-6b5d-44f4-b05a-de278f4ca572" (UID: "23b737c7-6b5d-44f4-b05a-de278f4ca572"). InnerVolumeSpecName "kube-api-access-c44jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.221472 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23b737c7-6b5d-44f4-b05a-de278f4ca572-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.221853 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c44jd\" (UniqueName: \"kubernetes.io/projected/23b737c7-6b5d-44f4-b05a-de278f4ca572-kube-api-access-c44jd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.609390 4830 generic.go:334] "Generic (PLEG): container finished" podID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerID="a20dea92408e3316b920fe3e34c3564167b91f44ec33c56fd94553ec6a29e550" exitCode=137 Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.609516 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"a20dea92408e3316b920fe3e34c3564167b91f44ec33c56fd94553ec6a29e550"} Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.609549 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ccc6cbaa-b562-49fc-9add-94aac04d60ed","Type":"ContainerDied","Data":"b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655"} Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.609564 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a04edc260b00fec775e4448595db34b84f53d267f187b4f82e648c4ae48655" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.612199 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dv8kn_23b737c7-6b5d-44f4-b05a-de278f4ca572/ovs-vswitchd/0.log" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.613182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dv8kn" event={"ID":"23b737c7-6b5d-44f4-b05a-de278f4ca572","Type":"ContainerDied","Data":"d486014fb7ebdc66db36195db223508854fbe9541cb559f93347e98398c82346"} Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.613226 4830 scope.go:117] "RemoveContainer" containerID="880631acc0141d0007f3a250db7aaba33c7a12bda1b531c7c202660030481e50" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.613393 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dv8kn" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.635059 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.661554 4830 scope.go:117] "RemoveContainer" containerID="4313aeca09319496e49ef3792913dc8a2add41ae7a995cfc84318657fe1f23cc" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.673134 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"] Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.682255 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-dv8kn"] Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.703485 4830 scope.go:117] "RemoveContainer" containerID="18ea142ceb8d413ddf0c7ab0f2cbe4c96f2ce6a59c01ff0b773b207d1a0b8f74" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.728591 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.728663 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.728697 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.728957 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jrmf\" (UniqueName: 
\"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.729118 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.729153 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle\") pod \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\" (UID: \"ccc6cbaa-b562-49fc-9add-94aac04d60ed\") " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.730597 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache" (OuterVolumeSpecName: "cache") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.734907 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.735382 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock" (OuterVolumeSpecName: "lock") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.738028 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf" (OuterVolumeSpecName: "kube-api-access-5jrmf") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "kube-api-access-5jrmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.750965 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.831198 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.831257 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.831270 4830 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-cache\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.831281 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jrmf\" (UniqueName: \"kubernetes.io/projected/ccc6cbaa-b562-49fc-9add-94aac04d60ed-kube-api-access-5jrmf\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.831296 4830 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccc6cbaa-b562-49fc-9add-94aac04d60ed-lock\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.846400 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.933103 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:30 crc kubenswrapper[4830]: I0318 18:25:30.953414 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccc6cbaa-b562-49fc-9add-94aac04d60ed" (UID: "ccc6cbaa-b562-49fc-9add-94aac04d60ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:25:31 crc kubenswrapper[4830]: I0318 18:25:31.034197 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc6cbaa-b562-49fc-9add-94aac04d60ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:31 crc kubenswrapper[4830]: I0318 18:25:31.624024 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 18:25:31 crc kubenswrapper[4830]: I0318 18:25:31.672363 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 18:25:31 crc kubenswrapper[4830]: I0318 18:25:31.687611 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 18:25:32 crc kubenswrapper[4830]: I0318 18:25:32.244116 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" path="/var/lib/kubelet/pods/23b737c7-6b5d-44f4-b05a-de278f4ca572/volumes"
Mar 18 18:25:32 crc kubenswrapper[4830]: I0318 18:25:32.244820 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" path="/var/lib/kubelet/pods/ccc6cbaa-b562-49fc-9add-94aac04d60ed/volumes"
Mar 18 18:25:32 crc kubenswrapper[4830]: I0318 18:25:32.984375 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:33 crc kubenswrapper[4830]: I0318 18:25:33.028528 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:33 crc kubenswrapper[4830]: I0318 18:25:33.215187 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:34 crc kubenswrapper[4830]: I0318 18:25:34.080018 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:25:34 crc kubenswrapper[4830]: I0318 18:25:34.080081 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="0ac8a4f8-88e7-4cd0-ab89-210fb088b137" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:25:34 crc kubenswrapper[4830]: I0318 18:25:34.647571 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gwsh9" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="registry-server" containerID="cri-o://23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6" gracePeriod=2
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.022335 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.087099 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content\") pod \"f6c71011-0d91-4dbe-8520-43816df80374\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") "
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.087181 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities\") pod \"f6c71011-0d91-4dbe-8520-43816df80374\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") "
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.087335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjmtv\" (UniqueName: \"kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv\") pod \"f6c71011-0d91-4dbe-8520-43816df80374\" (UID: \"f6c71011-0d91-4dbe-8520-43816df80374\") "
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.089249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities" (OuterVolumeSpecName: "utilities") pod "f6c71011-0d91-4dbe-8520-43816df80374" (UID: "f6c71011-0d91-4dbe-8520-43816df80374"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.098141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv" (OuterVolumeSpecName: "kube-api-access-pjmtv") pod "f6c71011-0d91-4dbe-8520-43816df80374" (UID: "f6c71011-0d91-4dbe-8520-43816df80374"). InnerVolumeSpecName "kube-api-access-pjmtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.189512 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjmtv\" (UniqueName: \"kubernetes.io/projected/f6c71011-0d91-4dbe-8520-43816df80374-kube-api-access-pjmtv\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.191028 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.233444 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6c71011-0d91-4dbe-8520-43816df80374" (UID: "f6c71011-0d91-4dbe-8520-43816df80374"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.292405 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c71011-0d91-4dbe-8520-43816df80374-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.661019 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6c71011-0d91-4dbe-8520-43816df80374" containerID="23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6" exitCode=0
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.661070 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerDied","Data":"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"}
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.661105 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwsh9" event={"ID":"f6c71011-0d91-4dbe-8520-43816df80374","Type":"ContainerDied","Data":"c36dbc7e4c11f90af3de7069f42c906c461e17b44b5ae972b255bcf8c3e038f8"}
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.661127 4830 scope.go:117] "RemoveContainer" containerID="23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.661194 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwsh9"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.690044 4830 scope.go:117] "RemoveContainer" containerID="f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.706206 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.713658 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gwsh9"]
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.733658 4830 scope.go:117] "RemoveContainer" containerID="8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.756539 4830 scope.go:117] "RemoveContainer" containerID="23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"
Mar 18 18:25:35 crc kubenswrapper[4830]: E0318 18:25:35.758709 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6\": container with ID starting with 23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6 not found: ID does not exist" containerID="23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.758868 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6"} err="failed to get container status \"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6\": rpc error: code = NotFound desc = could not find container \"23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6\": container with ID starting with 23ec5f5359d2f176556cc62a37ce3d12b155f12fc668df61bb421fd59f375ff6 not found: ID does not exist"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.758957 4830 scope.go:117] "RemoveContainer" containerID="f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"
Mar 18 18:25:35 crc kubenswrapper[4830]: E0318 18:25:35.760309 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27\": container with ID starting with f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27 not found: ID does not exist" containerID="f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.760352 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27"} err="failed to get container status \"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27\": rpc error: code = NotFound desc = could not find container \"f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27\": container with ID starting with f2a67747825fa9e9684a5b2631bb1727df201c710ceaf61da3ea49d949e8ba27 not found: ID does not exist"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.760370 4830 scope.go:117] "RemoveContainer" containerID="8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c"
Mar 18 18:25:35 crc kubenswrapper[4830]: E0318 18:25:35.761469 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c\": container with ID starting with 8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c not found: ID does not exist" containerID="8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c"
Mar 18 18:25:35 crc kubenswrapper[4830]: I0318 18:25:35.761554 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c"} err="failed to get container status \"8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c\": rpc error: code = NotFound desc = could not find container \"8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c\": container with ID starting with 8d2c8c2ecc9f860131130c7f010ff406095349f38bb9af99b4f872834e2f0e4c not found: ID does not exist"
Mar 18 18:25:36 crc kubenswrapper[4830]: I0318 18:25:36.248136 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c71011-0d91-4dbe-8520-43816df80374" path="/var/lib/kubelet/pods/f6c71011-0d91-4dbe-8520-43816df80374/volumes"
Mar 18 18:25:59 crc kubenswrapper[4830]: I0318 18:25:59.510175 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:25:59 crc kubenswrapper[4830]: I0318 18:25:59.510828 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.172891 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564306-q66sx"]
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173367 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173394 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173421 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173434 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173455 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="registry-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173470 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="registry-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173494 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server-init"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173506 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server-init"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173522 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173534 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173556 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="extract-utilities"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173570 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="extract-utilities"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173585 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173597 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173617 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173629 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173647 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173659 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173672 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173685 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173711 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173726 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173749 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-expirer"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173760 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-expirer"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173798 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173813 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173836 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173850 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173872 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="rsync"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173886 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="rsync"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173900 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173912 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173932 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="extract-content"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173944 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="extract-content"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173960 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.173972 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.173989 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-reaper"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174001 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-reaper"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.174017 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="swift-recon-cron"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174034 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="swift-recon-cron"
Mar 18 18:26:00 crc kubenswrapper[4830]: E0318 18:26:00.174060 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174076 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174323 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174348 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c71011-0d91-4dbe-8520-43816df80374" containerName="registry-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174369 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-reaper"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174389 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174405 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovsdb-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174425 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174439 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b737c7-6b5d-44f4-b05a-de278f4ca572" containerName="ovs-vswitchd"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174460 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174473 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-replicator"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174485 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-expirer"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174504 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="swift-recon-cron"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174521 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174543 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="account-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174557 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174576 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="container-updater"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174592 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-auditor"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174609 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="object-server"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.174624 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc6cbaa-b562-49fc-9add-94aac04d60ed" containerName="rsync"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.175469 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.178104 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.180444 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-q66sx"]
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.181231 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.181319 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.287451 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpqf8\" (UniqueName: \"kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8\") pod \"auto-csr-approver-29564306-q66sx\" (UID: \"07cd51e9-9091-4a1c-8055-448c0df097ed\") " pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.389002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpqf8\" (UniqueName: \"kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8\") pod \"auto-csr-approver-29564306-q66sx\" (UID: \"07cd51e9-9091-4a1c-8055-448c0df097ed\") " pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.422344 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpqf8\" (UniqueName: \"kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8\") pod \"auto-csr-approver-29564306-q66sx\" (UID: \"07cd51e9-9091-4a1c-8055-448c0df097ed\") " pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:00 crc kubenswrapper[4830]: I0318 18:26:00.510141 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:01 crc kubenswrapper[4830]: I0318 18:26:01.008829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-q66sx"]
Mar 18 18:26:01 crc kubenswrapper[4830]: W0318 18:26:01.020950 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07cd51e9_9091_4a1c_8055_448c0df097ed.slice/crio-20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428 WatchSource:0}: Error finding container 20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428: Status 404 returned error can't find the container with id 20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428
Mar 18 18:26:01 crc kubenswrapper[4830]: I0318 18:26:01.962935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-q66sx" event={"ID":"07cd51e9-9091-4a1c-8055-448c0df097ed","Type":"ContainerStarted","Data":"20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428"}
Mar 18 18:26:02 crc kubenswrapper[4830]: I0318 18:26:02.975277 4830 generic.go:334] "Generic (PLEG): container finished" podID="07cd51e9-9091-4a1c-8055-448c0df097ed" containerID="3397918b5d9c6f9e9358b8667a445e69620688770fa08a78734184570748fd51" exitCode=0
Mar 18 18:26:02 crc kubenswrapper[4830]: I0318 18:26:02.975367 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-q66sx" event={"ID":"07cd51e9-9091-4a1c-8055-448c0df097ed","Type":"ContainerDied","Data":"3397918b5d9c6f9e9358b8667a445e69620688770fa08a78734184570748fd51"}
Mar 18 18:26:04 crc kubenswrapper[4830]: I0318 18:26:04.300866 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:04 crc kubenswrapper[4830]: I0318 18:26:04.347438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpqf8\" (UniqueName: \"kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8\") pod \"07cd51e9-9091-4a1c-8055-448c0df097ed\" (UID: \"07cd51e9-9091-4a1c-8055-448c0df097ed\") "
Mar 18 18:26:04 crc kubenswrapper[4830]: I0318 18:26:04.353226 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8" (OuterVolumeSpecName: "kube-api-access-fpqf8") pod "07cd51e9-9091-4a1c-8055-448c0df097ed" (UID: "07cd51e9-9091-4a1c-8055-448c0df097ed"). InnerVolumeSpecName "kube-api-access-fpqf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:26:04 crc kubenswrapper[4830]: I0318 18:26:04.448515 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpqf8\" (UniqueName: \"kubernetes.io/projected/07cd51e9-9091-4a1c-8055-448c0df097ed-kube-api-access-fpqf8\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:05 crc kubenswrapper[4830]: I0318 18:26:05.000533 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-q66sx" event={"ID":"07cd51e9-9091-4a1c-8055-448c0df097ed","Type":"ContainerDied","Data":"20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428"}
Mar 18 18:26:05 crc kubenswrapper[4830]: I0318 18:26:05.000590 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e7c3131cea6272d5ca8e52cf9df8f2eaba573cb901c1da2132f3de92517428"
Mar 18 18:26:05 crc kubenswrapper[4830]: I0318 18:26:05.000597 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-q66sx"
Mar 18 18:26:05 crc kubenswrapper[4830]: I0318 18:26:05.368017 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-tc5tf"]
Mar 18 18:26:05 crc kubenswrapper[4830]: I0318 18:26:05.373116 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-tc5tf"]
Mar 18 18:26:06 crc kubenswrapper[4830]: I0318 18:26:06.250484 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ac501e-b22c-4dd7-8e7e-51c56f870890" path="/var/lib/kubelet/pods/e0ac501e-b22c-4dd7-8e7e-51c56f870890/volumes"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.262622 4830 scope.go:117] "RemoveContainer" containerID="a98d18015621f551484484a4e5423ae99f7e30bab0f975455f87bfd8b54217cb"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.308719 4830 scope.go:117] "RemoveContainer" containerID="ab2208ca95c916d6035b8232ffd1553a2e84b6421f7e81beebc4d247d69149c0"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.370590 4830 scope.go:117] "RemoveContainer" containerID="1ccee24cadf8c0c5bd3c5bc1d38965eb6e50e70aa0322e3306f65f0b4f8a4891"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.399160 4830 scope.go:117] "RemoveContainer" containerID="c8b481dbe3098add8fb7e6339fdbf0bcaecb99b50c86f76e144ae1ccaa2b3c6f"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.429989 4830 scope.go:117] "RemoveContainer" containerID="3073305b4183467e7f6c2b40e18ca0a3dc5dd325e4392cdfee5efad929986263"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.458994 4830 scope.go:117] "RemoveContainer" containerID="a4338a1a65a166268b1973a86ac23dc739a16a3c7ea8b2c8be7a7cf3c241ecee"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.486283 4830 scope.go:117] "RemoveContainer" containerID="7eab1cf8b6cb575621ae6e6f99b624e1a23b211fa8cf4fe29aa7e8049a993337"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.519744 4830 scope.go:117] "RemoveContainer" containerID="e6ff33896ab819ecb0f974d24f8341e4cd47187d5b83f4031d921d854055799e"
Mar 18 18:26:13 crc kubenswrapper[4830]: I0318 18:26:13.542461 4830 scope.go:117] "RemoveContainer" containerID="ca885e950a7893618761320e7bff3491a6c83307e3faaddbb8ddf40f9d55f77e"
Mar 18 18:26:29 crc kubenswrapper[4830]: I0318 18:26:29.509641 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:26:29 crc kubenswrapper[4830]: I0318 18:26:29.511680 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:26:29 crc kubenswrapper[4830]: I0318 18:26:29.511907 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:26:29 crc kubenswrapper[4830]: I0318 18:26:29.512932 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:26:29 crc kubenswrapper[4830]: I0318 18:26:29.513154 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a" gracePeriod=600
Mar 18 18:26:30 crc kubenswrapper[4830]: I0318 18:26:30.267866 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a" exitCode=0
Mar 18 18:26:30 crc kubenswrapper[4830]: I0318 18:26:30.267930 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a"}
Mar 18 18:26:30 crc kubenswrapper[4830]: I0318 18:26:30.268355 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"}
Mar 18 18:26:30 crc kubenswrapper[4830]: I0318 18:26:30.268378 4830 scope.go:117] "RemoveContainer" containerID="95c4e07cab8acd660c3305d62103b7c04d3c929938a23e2544d7e9b8fe0b847c"
Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.030456 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zm2m"]
Mar 18 18:26:58 crc kubenswrapper[4830]: E0318 18:26:58.031615 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cd51e9-9091-4a1c-8055-448c0df097ed" containerName="oc"
Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.031645 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cd51e9-9091-4a1c-8055-448c0df097ed" containerName="oc"
Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.031957 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cd51e9-9091-4a1c-8055-448c0df097ed" containerName="oc"
Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.033718 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.060511 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zm2m"] Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.164716 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgpr2\" (UniqueName: \"kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.164809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.164852 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.265648 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.265723 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.265869 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgpr2\" (UniqueName: \"kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.266243 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.266399 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.291033 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgpr2\" (UniqueName: \"kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2\") pod \"redhat-marketplace-6zm2m\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.365149 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:26:58 crc kubenswrapper[4830]: I0318 18:26:58.806002 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zm2m"] Mar 18 18:26:59 crc kubenswrapper[4830]: I0318 18:26:59.609878 4830 generic.go:334] "Generic (PLEG): container finished" podID="c4488db7-211b-4c97-a321-71b1dba06765" containerID="833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6" exitCode=0 Mar 18 18:26:59 crc kubenswrapper[4830]: I0318 18:26:59.609979 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerDied","Data":"833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6"} Mar 18 18:26:59 crc kubenswrapper[4830]: I0318 18:26:59.610260 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerStarted","Data":"63fd9aaa7f79d1f906447e6e68434522c3141ab6fe82aca1d6e32380a62e1622"} Mar 18 18:27:01 crc kubenswrapper[4830]: I0318 18:27:01.630493 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerStarted","Data":"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582"} Mar 18 18:27:02 crc kubenswrapper[4830]: I0318 18:27:02.642055 4830 generic.go:334] "Generic (PLEG): container finished" podID="c4488db7-211b-4c97-a321-71b1dba06765" containerID="469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582" exitCode=0 Mar 18 18:27:02 crc kubenswrapper[4830]: I0318 18:27:02.642312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" 
event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerDied","Data":"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582"} Mar 18 18:27:03 crc kubenswrapper[4830]: I0318 18:27:03.651726 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerStarted","Data":"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6"} Mar 18 18:27:03 crc kubenswrapper[4830]: I0318 18:27:03.673827 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zm2m" podStartSLOduration=2.176258411 podStartE2EDuration="5.673806732s" podCreationTimestamp="2026-03-18 18:26:58 +0000 UTC" firstStartedPulling="2026-03-18 18:26:59.611554685 +0000 UTC m=+1454.179185027" lastFinishedPulling="2026-03-18 18:27:03.109103016 +0000 UTC m=+1457.676733348" observedRunningTime="2026-03-18 18:27:03.671326281 +0000 UTC m=+1458.238956613" watchObservedRunningTime="2026-03-18 18:27:03.673806732 +0000 UTC m=+1458.241437074" Mar 18 18:27:08 crc kubenswrapper[4830]: I0318 18:27:08.365973 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:08 crc kubenswrapper[4830]: I0318 18:27:08.366664 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:08 crc kubenswrapper[4830]: I0318 18:27:08.440868 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:08 crc kubenswrapper[4830]: I0318 18:27:08.778475 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:08 crc kubenswrapper[4830]: I0318 18:27:08.856395 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6zm2m"] Mar 18 18:27:10 crc kubenswrapper[4830]: I0318 18:27:10.719327 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6zm2m" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="registry-server" containerID="cri-o://ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6" gracePeriod=2 Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.132895 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.134530 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.150056 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.167715 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.167812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbv7\" (UniqueName: \"kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.167876 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.247299 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.269581 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.269669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.269703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbv7\" (UniqueName: \"kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.270390 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc 
kubenswrapper[4830]: I0318 18:27:11.270580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.300379 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbv7\" (UniqueName: \"kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7\") pod \"certified-operators-gn778\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.370332 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content\") pod \"c4488db7-211b-4c97-a321-71b1dba06765\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.370459 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities\") pod \"c4488db7-211b-4c97-a321-71b1dba06765\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.370481 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgpr2\" (UniqueName: \"kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2\") pod \"c4488db7-211b-4c97-a321-71b1dba06765\" (UID: \"c4488db7-211b-4c97-a321-71b1dba06765\") " Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.371796 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities" (OuterVolumeSpecName: "utilities") pod "c4488db7-211b-4c97-a321-71b1dba06765" (UID: "c4488db7-211b-4c97-a321-71b1dba06765"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.374900 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2" (OuterVolumeSpecName: "kube-api-access-dgpr2") pod "c4488db7-211b-4c97-a321-71b1dba06765" (UID: "c4488db7-211b-4c97-a321-71b1dba06765"). InnerVolumeSpecName "kube-api-access-dgpr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.422892 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4488db7-211b-4c97-a321-71b1dba06765" (UID: "c4488db7-211b-4c97-a321-71b1dba06765"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.454029 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.473355 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.473388 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgpr2\" (UniqueName: \"kubernetes.io/projected/c4488db7-211b-4c97-a321-71b1dba06765-kube-api-access-dgpr2\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.473397 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4488db7-211b-4c97-a321-71b1dba06765-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.727380 4830 generic.go:334] "Generic (PLEG): container finished" podID="c4488db7-211b-4c97-a321-71b1dba06765" containerID="ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6" exitCode=0 Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.727534 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerDied","Data":"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6"} Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.727670 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zm2m" event={"ID":"c4488db7-211b-4c97-a321-71b1dba06765","Type":"ContainerDied","Data":"63fd9aaa7f79d1f906447e6e68434522c3141ab6fe82aca1d6e32380a62e1622"} Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.727694 4830 scope.go:117] "RemoveContainer" containerID="ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6" Mar 18 18:27:11 crc 
kubenswrapper[4830]: I0318 18:27:11.727629 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zm2m" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.743634 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:11 crc kubenswrapper[4830]: W0318 18:27:11.754541 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ff90f4_ffad_4379_9bbe_9ef9412229df.slice/crio-305a1a7b0c10a2f7c92891cabf6824152d599793396190f927d3f6bda0c672b7 WatchSource:0}: Error finding container 305a1a7b0c10a2f7c92891cabf6824152d599793396190f927d3f6bda0c672b7: Status 404 returned error can't find the container with id 305a1a7b0c10a2f7c92891cabf6824152d599793396190f927d3f6bda0c672b7 Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.767826 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zm2m"] Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.772845 4830 scope.go:117] "RemoveContainer" containerID="469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.775648 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zm2m"] Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.805593 4830 scope.go:117] "RemoveContainer" containerID="833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.828048 4830 scope.go:117] "RemoveContainer" containerID="ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6" Mar 18 18:27:11 crc kubenswrapper[4830]: E0318 18:27:11.828368 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6\": container with ID starting with ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6 not found: ID does not exist" containerID="ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.828395 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6"} err="failed to get container status \"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6\": rpc error: code = NotFound desc = could not find container \"ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6\": container with ID starting with ea15be20ab88c7198b0ee4f5c804fce721ce4da86c5b288d2a0e65b5213900d6 not found: ID does not exist" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.828414 4830 scope.go:117] "RemoveContainer" containerID="469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582" Mar 18 18:27:11 crc kubenswrapper[4830]: E0318 18:27:11.830172 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582\": container with ID starting with 469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582 not found: ID does not exist" containerID="469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.830197 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582"} err="failed to get container status \"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582\": rpc error: code = NotFound desc = could not find container \"469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582\": container with ID 
starting with 469daccb2e4c12c32d13a2a90495e2648d1013e5b08cd6dcf8ed3e44c8118582 not found: ID does not exist" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.830213 4830 scope.go:117] "RemoveContainer" containerID="833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6" Mar 18 18:27:11 crc kubenswrapper[4830]: E0318 18:27:11.831687 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6\": container with ID starting with 833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6 not found: ID does not exist" containerID="833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6" Mar 18 18:27:11 crc kubenswrapper[4830]: I0318 18:27:11.831712 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6"} err="failed to get container status \"833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6\": rpc error: code = NotFound desc = could not find container \"833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6\": container with ID starting with 833291c81b5c8c6413b0594557668b759524f59c8147f8e6628911192b7508e6 not found: ID does not exist" Mar 18 18:27:12 crc kubenswrapper[4830]: I0318 18:27:12.252094 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4488db7-211b-4c97-a321-71b1dba06765" path="/var/lib/kubelet/pods/c4488db7-211b-4c97-a321-71b1dba06765/volumes" Mar 18 18:27:12 crc kubenswrapper[4830]: I0318 18:27:12.744318 4830 generic.go:334] "Generic (PLEG): container finished" podID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerID="0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb" exitCode=0 Mar 18 18:27:12 crc kubenswrapper[4830]: I0318 18:27:12.744380 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerDied","Data":"0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb"} Mar 18 18:27:12 crc kubenswrapper[4830]: I0318 18:27:12.744412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerStarted","Data":"305a1a7b0c10a2f7c92891cabf6824152d599793396190f927d3f6bda0c672b7"} Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.756039 4830 scope.go:117] "RemoveContainer" containerID="36d3831532c5080f76c17b505df06b38c560192c7e4793abf714c8adf589ca70" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.785236 4830 scope.go:117] "RemoveContainer" containerID="dfdcf803a034b493e9fb3e0d17cf9adc12be15bac2d2e9b4f31b3f1c84c90d38" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.820361 4830 scope.go:117] "RemoveContainer" containerID="7c99bf884aafec1098e4dad942ec1e611c8c819bbe83815db3fdad19bac4fd8e" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.863180 4830 scope.go:117] "RemoveContainer" containerID="a4919c4786f2548b6767558777a241dc56d419f9904e042566ee841adbe1f1e3" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.889850 4830 scope.go:117] "RemoveContainer" containerID="933487d6b7c0d60ba81cf11b01ceaae63489030bbb5fd50148a67d7724abf942" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.909452 4830 scope.go:117] "RemoveContainer" containerID="230ee49d6b1a370d560b5372a13a835245324de53ff64139c40f778e9c9df746" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.930584 4830 scope.go:117] "RemoveContainer" containerID="a00384955c734a22389aeb43e6ded62b5afecb8b1d30da824d579f5a71c2a71a" Mar 18 18:27:13 crc kubenswrapper[4830]: I0318 18:27:13.963382 4830 scope.go:117] "RemoveContainer" containerID="e9e760a71aaf066d2755a2ade0ff78d11fa557bca8fdd20885df296496a7747d" Mar 18 18:27:13 crc kubenswrapper[4830]: 
I0318 18:27:13.995613 4830 scope.go:117] "RemoveContainer" containerID="aa08cf82fb2e3b65a4db0cafef455de1e72e29ea5cb6fff9ccdb05335df61a7a" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.016093 4830 scope.go:117] "RemoveContainer" containerID="b679d58105fc60f57cf5c7272800740c97568fe76f6fe223f1159f845d0f52b1" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.047403 4830 scope.go:117] "RemoveContainer" containerID="19e2f77105d5703f0646d3c61e7fe7c902c627dbb91bbc626d9e5d5bb3fa485c" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.090143 4830 scope.go:117] "RemoveContainer" containerID="76403c1ba924bd720fa9ad0e8b9fdcc56b2531f042be32e994982f3fc7c33064" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.111163 4830 scope.go:117] "RemoveContainer" containerID="6147c63258a2165648c12a669fffd532aabced926b601af8d7cde628ab4b44a4" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.133259 4830 scope.go:117] "RemoveContainer" containerID="3d815a588191bb1f303bdb826c5890be7d59362cd896066cfe0bd7ed228c7623" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.202523 4830 scope.go:117] "RemoveContainer" containerID="3e305ed4bde4816d27ef7a6fa961104500d0307d2c1c0ea2ab3b18a26aa4b3f2" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.232836 4830 scope.go:117] "RemoveContainer" containerID="7e87a03e3adb66017525596597b8739a2dd883902ed90c632e4e5cbfbfade6fe" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.259610 4830 scope.go:117] "RemoveContainer" containerID="44d7f582b1786283b3e923d17f41dabde89bb1069ef6be7a6bc4c163e7c6d398" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.277452 4830 scope.go:117] "RemoveContainer" containerID="6c5bd47f9683cd9c5e03e6fd6c51407a8085923fe0e9f8ac3506a2c980271f44" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.294775 4830 scope.go:117] "RemoveContainer" containerID="d154a60d139fbff2ee36c61dbc4db23537b10afebf0dabcddf0ce4e5874a933a" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.316007 4830 
scope.go:117] "RemoveContainer" containerID="119d8ed4821b740db22ee1445ab8b58f898477bd08bc5e02c8c7846598245ad2" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.351955 4830 scope.go:117] "RemoveContainer" containerID="93c102b5fc9f4a88a8768bdc36062b71725eccef648daf124aa807bb59ea8cd8" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.369352 4830 scope.go:117] "RemoveContainer" containerID="b34b3a8f9ec6ede03cd125304c68ec8d92f19169893d3ffa48c8c3477adb2572" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.392733 4830 scope.go:117] "RemoveContainer" containerID="68ad223077ac746b9802f4eba8764e5eaa00ca98bf3773872d2cd95daf9b38f0" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.409895 4830 scope.go:117] "RemoveContainer" containerID="a19a7ebcd14a4be0ac0694088743b29c2a922f84d22e54d870830b7764d78682" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.426428 4830 scope.go:117] "RemoveContainer" containerID="a20dea92408e3316b920fe3e34c3564167b91f44ec33c56fd94553ec6a29e550" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.439261 4830 scope.go:117] "RemoveContainer" containerID="f310910b8917cc5993658361fcb89166ad9ce2e06a60d4b649ae75c286fa88be" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.455186 4830 scope.go:117] "RemoveContainer" containerID="964d29d1c66eefd1d61b425bffee997a70c6c81bf69aabae1c5c8383843cd69b" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.475939 4830 scope.go:117] "RemoveContainer" containerID="2ec0b23590aa07b58f0bce309a6926e720bbad903a64b11e9d199099e2f7a8f8" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.492021 4830 scope.go:117] "RemoveContainer" containerID="149f880fe60e94677e4b390ed1783c9f69df4024032f5dd5d9d59bc9f45e506a" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.506808 4830 scope.go:117] "RemoveContainer" containerID="260f46cb5b74d58cda51ed1891b814c7af3b8dc935ea5d26486826dc9676c6a4" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.527889 4830 scope.go:117] 
"RemoveContainer" containerID="ddac036c21cf6e7f7086be2d69ffc2a2c68d39299f23922898025a29a0596dc2" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.542228 4830 scope.go:117] "RemoveContainer" containerID="7c515d7ae75f9681256fdbe1f69965eb9c3684f823264317db51e5df64736fcd" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.575682 4830 scope.go:117] "RemoveContainer" containerID="bfe502cfd69f0dcee4198317e7340351aefeb4d8e022de2043630ae66cc8612a" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.599507 4830 scope.go:117] "RemoveContainer" containerID="1176cdff085a64af57931e22a9423ae76c0f52837134d47b63aa9518c32e92c6" Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.774755 4830 generic.go:334] "Generic (PLEG): container finished" podID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerID="65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb" exitCode=0 Mar 18 18:27:14 crc kubenswrapper[4830]: I0318 18:27:14.774840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerDied","Data":"65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb"} Mar 18 18:27:15 crc kubenswrapper[4830]: I0318 18:27:15.811161 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerStarted","Data":"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6"} Mar 18 18:27:15 crc kubenswrapper[4830]: I0318 18:27:15.835276 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn778" podStartSLOduration=2.2560009389999998 podStartE2EDuration="4.835252147s" podCreationTimestamp="2026-03-18 18:27:11 +0000 UTC" firstStartedPulling="2026-03-18 18:27:12.746320445 +0000 UTC m=+1467.313950807" lastFinishedPulling="2026-03-18 18:27:15.325571643 +0000 UTC 
m=+1469.893202015" observedRunningTime="2026-03-18 18:27:15.830525622 +0000 UTC m=+1470.398155994" watchObservedRunningTime="2026-03-18 18:27:15.835252147 +0000 UTC m=+1470.402882489" Mar 18 18:27:21 crc kubenswrapper[4830]: I0318 18:27:21.454355 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:21 crc kubenswrapper[4830]: I0318 18:27:21.455180 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:21 crc kubenswrapper[4830]: I0318 18:27:21.533910 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:21 crc kubenswrapper[4830]: I0318 18:27:21.923942 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:21 crc kubenswrapper[4830]: I0318 18:27:21.996603 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:23 crc kubenswrapper[4830]: I0318 18:27:23.883871 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn778" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="registry-server" containerID="cri-o://9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6" gracePeriod=2 Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.204763 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.206466 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="extract-content" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.206497 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="extract-content" Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.206526 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="extract-utilities" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.206538 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="extract-utilities" Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.206581 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="registry-server" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.206590 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="registry-server" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.206813 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4488db7-211b-4c97-a321-71b1dba06765" containerName="registry-server" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.208400 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.228880 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.279664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.279707 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.279805 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcsb\" (UniqueName: \"kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.340153 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.380465 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbv7\" (UniqueName: \"kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7\") pod \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.380524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities\") pod \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.380701 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content\") pod \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\" (UID: \"a8ff90f4-ffad-4379-9bbe-9ef9412229df\") " Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.380938 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.380976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.381082 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcsb\" (UniqueName: \"kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.381579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.381922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.382412 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities" (OuterVolumeSpecName: "utilities") pod "a8ff90f4-ffad-4379-9bbe-9ef9412229df" (UID: "a8ff90f4-ffad-4379-9bbe-9ef9412229df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.390990 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7" (OuterVolumeSpecName: "kube-api-access-8wbv7") pod "a8ff90f4-ffad-4379-9bbe-9ef9412229df" (UID: "a8ff90f4-ffad-4379-9bbe-9ef9412229df"). InnerVolumeSpecName "kube-api-access-8wbv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.401347 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcsb\" (UniqueName: \"kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb\") pod \"community-operators-l7gp4\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.467554 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8ff90f4-ffad-4379-9bbe-9ef9412229df" (UID: "a8ff90f4-ffad-4379-9bbe-9ef9412229df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.482019 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.482062 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbv7\" (UniqueName: \"kubernetes.io/projected/a8ff90f4-ffad-4379-9bbe-9ef9412229df-kube-api-access-8wbv7\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.482079 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ff90f4-ffad-4379-9bbe-9ef9412229df-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.545160 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.894136 4830 generic.go:334] "Generic (PLEG): container finished" podID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerID="9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6" exitCode=0 Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.894191 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn778" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.894188 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerDied","Data":"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6"} Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.894621 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn778" event={"ID":"a8ff90f4-ffad-4379-9bbe-9ef9412229df","Type":"ContainerDied","Data":"305a1a7b0c10a2f7c92891cabf6824152d599793396190f927d3f6bda0c672b7"} Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.894696 4830 scope.go:117] "RemoveContainer" containerID="9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.924022 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.941603 4830 scope.go:117] "RemoveContainer" containerID="65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.944137 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn778"] Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.962859 4830 scope.go:117] "RemoveContainer" 
containerID="0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.980213 4830 scope.go:117] "RemoveContainer" containerID="9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6" Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.980717 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6\": container with ID starting with 9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6 not found: ID does not exist" containerID="9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.980800 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6"} err="failed to get container status \"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6\": rpc error: code = NotFound desc = could not find container \"9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6\": container with ID starting with 9d687bb896894974473734b1ed3ae4297573f3a97ed08d85201563472601f1b6 not found: ID does not exist" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.980840 4830 scope.go:117] "RemoveContainer" containerID="65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb" Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.981240 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb\": container with ID starting with 65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb not found: ID does not exist" containerID="65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb" Mar 18 18:27:24 crc 
kubenswrapper[4830]: I0318 18:27:24.981278 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb"} err="failed to get container status \"65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb\": rpc error: code = NotFound desc = could not find container \"65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb\": container with ID starting with 65913dea2de04555099e6eb244b3a48ad6058a3c265289143510b13dba7c55bb not found: ID does not exist" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.981306 4830 scope.go:117] "RemoveContainer" containerID="0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb" Mar 18 18:27:24 crc kubenswrapper[4830]: E0318 18:27:24.981576 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb\": container with ID starting with 0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb not found: ID does not exist" containerID="0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb" Mar 18 18:27:24 crc kubenswrapper[4830]: I0318 18:27:24.981610 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb"} err="failed to get container status \"0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb\": rpc error: code = NotFound desc = could not find container \"0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb\": container with ID starting with 0299ef743dbb06165402559d24cee5a0a76f1b9c48fc28b892f5566bc0eaaccb not found: ID does not exist" Mar 18 18:27:25 crc kubenswrapper[4830]: I0318 18:27:25.068191 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:25 
crc kubenswrapper[4830]: I0318 18:27:25.906214 4830 generic.go:334] "Generic (PLEG): container finished" podID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerID="d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830" exitCode=0 Mar 18 18:27:25 crc kubenswrapper[4830]: I0318 18:27:25.906417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerDied","Data":"d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830"} Mar 18 18:27:25 crc kubenswrapper[4830]: I0318 18:27:25.906587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerStarted","Data":"960a027401e1d6bec62e31fd97b04ba30f80cf149a9984ffc7ffc68e8db7d2f1"} Mar 18 18:27:26 crc kubenswrapper[4830]: I0318 18:27:26.249595 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" path="/var/lib/kubelet/pods/a8ff90f4-ffad-4379-9bbe-9ef9412229df/volumes" Mar 18 18:27:27 crc kubenswrapper[4830]: I0318 18:27:27.938976 4830 generic.go:334] "Generic (PLEG): container finished" podID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerID="f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b" exitCode=0 Mar 18 18:27:27 crc kubenswrapper[4830]: I0318 18:27:27.939406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerDied","Data":"f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b"} Mar 18 18:27:28 crc kubenswrapper[4830]: I0318 18:27:28.951588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" 
event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerStarted","Data":"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507"} Mar 18 18:27:28 crc kubenswrapper[4830]: I0318 18:27:28.982704 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7gp4" podStartSLOduration=2.525544783 podStartE2EDuration="4.98267643s" podCreationTimestamp="2026-03-18 18:27:24 +0000 UTC" firstStartedPulling="2026-03-18 18:27:25.909189952 +0000 UTC m=+1480.476820284" lastFinishedPulling="2026-03-18 18:27:28.366321589 +0000 UTC m=+1482.933951931" observedRunningTime="2026-03-18 18:27:28.973054853 +0000 UTC m=+1483.540685225" watchObservedRunningTime="2026-03-18 18:27:28.98267643 +0000 UTC m=+1483.550306802" Mar 18 18:27:34 crc kubenswrapper[4830]: I0318 18:27:34.546192 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:34 crc kubenswrapper[4830]: I0318 18:27:34.547170 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:34 crc kubenswrapper[4830]: I0318 18:27:34.617155 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:35 crc kubenswrapper[4830]: I0318 18:27:35.113142 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:35 crc kubenswrapper[4830]: I0318 18:27:35.174523 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.053391 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7gp4" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="registry-server" 
containerID="cri-o://e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507" gracePeriod=2 Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.516953 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.603851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vcsb\" (UniqueName: \"kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb\") pod \"f5ad7be1-69f6-4162-ae51-b834929b59f5\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.604087 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities\") pod \"f5ad7be1-69f6-4162-ae51-b834929b59f5\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.604139 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content\") pod \"f5ad7be1-69f6-4162-ae51-b834929b59f5\" (UID: \"f5ad7be1-69f6-4162-ae51-b834929b59f5\") " Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.605474 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities" (OuterVolumeSpecName: "utilities") pod "f5ad7be1-69f6-4162-ae51-b834929b59f5" (UID: "f5ad7be1-69f6-4162-ae51-b834929b59f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.609752 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb" (OuterVolumeSpecName: "kube-api-access-8vcsb") pod "f5ad7be1-69f6-4162-ae51-b834929b59f5" (UID: "f5ad7be1-69f6-4162-ae51-b834929b59f5"). InnerVolumeSpecName "kube-api-access-8vcsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.706054 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:37 crc kubenswrapper[4830]: I0318 18:27:37.706225 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vcsb\" (UniqueName: \"kubernetes.io/projected/f5ad7be1-69f6-4162-ae51-b834929b59f5-kube-api-access-8vcsb\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.067608 4830 generic.go:334] "Generic (PLEG): container finished" podID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerID="e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507" exitCode=0 Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.067657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerDied","Data":"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507"} Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.067721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7gp4" event={"ID":"f5ad7be1-69f6-4162-ae51-b834929b59f5","Type":"ContainerDied","Data":"960a027401e1d6bec62e31fd97b04ba30f80cf149a9984ffc7ffc68e8db7d2f1"} Mar 18 18:27:38 crc kubenswrapper[4830]: 
I0318 18:27:38.067731 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7gp4" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.067756 4830 scope.go:117] "RemoveContainer" containerID="e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.092998 4830 scope.go:117] "RemoveContainer" containerID="f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.126904 4830 scope.go:117] "RemoveContainer" containerID="d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.167430 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5ad7be1-69f6-4162-ae51-b834929b59f5" (UID: "f5ad7be1-69f6-4162-ae51-b834929b59f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.170608 4830 scope.go:117] "RemoveContainer" containerID="e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507" Mar 18 18:27:38 crc kubenswrapper[4830]: E0318 18:27:38.171061 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507\": container with ID starting with e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507 not found: ID does not exist" containerID="e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.171102 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507"} err="failed to get container status \"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507\": rpc error: code = NotFound desc = could not find container \"e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507\": container with ID starting with e1eced6cc19067d07f356e5d792fb5d2b2339a860de76a581db43128ef4dc507 not found: ID does not exist" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.171127 4830 scope.go:117] "RemoveContainer" containerID="f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b" Mar 18 18:27:38 crc kubenswrapper[4830]: E0318 18:27:38.171388 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b\": container with ID starting with f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b not found: ID does not exist" containerID="f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.171430 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b"} err="failed to get container status \"f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b\": rpc error: code = NotFound desc = could not find container \"f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b\": container with ID starting with f1174bf8db0ab6ccefa7801db9e1180937ce25fcfb8c8269c2c65e9fe54b489b not found: ID does not exist" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.171455 4830 scope.go:117] "RemoveContainer" containerID="d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830" Mar 18 18:27:38 crc kubenswrapper[4830]: E0318 18:27:38.171755 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830\": container with ID starting with d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830 not found: ID does not exist" containerID="d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.171808 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830"} err="failed to get container status \"d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830\": rpc error: code = NotFound desc = could not find container \"d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830\": container with ID starting with d38eb54027fd2e58477ace2e706308d47a7e375204e75ac203f41a378140a830 not found: ID does not exist" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.213993 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f5ad7be1-69f6-4162-ae51-b834929b59f5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.404166 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:38 crc kubenswrapper[4830]: I0318 18:27:38.416898 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7gp4"] Mar 18 18:27:40 crc kubenswrapper[4830]: I0318 18:27:40.248213 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" path="/var/lib/kubelet/pods/f5ad7be1-69f6-4162-ae51-b834929b59f5/volumes" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.206200 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564308-8s2dd"] Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207333 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207355 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207377 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207391 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207418 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="extract-content" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207432 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="extract-content" Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207453 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="extract-utilities" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207466 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="extract-utilities" Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207482 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="extract-utilities" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207496 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="extract-utilities" Mar 18 18:28:00 crc kubenswrapper[4830]: E0318 18:28:00.207528 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="extract-content" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207543 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="extract-content" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207962 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ad7be1-69f6-4162-ae51-b834929b59f5" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.207996 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ff90f4-ffad-4379-9bbe-9ef9412229df" containerName="registry-server" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.208738 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.211248 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.211536 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.211712 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.214486 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-8s2dd"] Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.269438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprn8\" (UniqueName: \"kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8\") pod \"auto-csr-approver-29564308-8s2dd\" (UID: \"09371978-713b-4735-b7c2-28600f7a30af\") " pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.372960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprn8\" (UniqueName: \"kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8\") pod \"auto-csr-approver-29564308-8s2dd\" (UID: \"09371978-713b-4735-b7c2-28600f7a30af\") " pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.401435 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprn8\" (UniqueName: \"kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8\") pod \"auto-csr-approver-29564308-8s2dd\" (UID: \"09371978-713b-4735-b7c2-28600f7a30af\") " 
pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:00 crc kubenswrapper[4830]: I0318 18:28:00.533173 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:01 crc kubenswrapper[4830]: I0318 18:28:01.046955 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-8s2dd"] Mar 18 18:28:01 crc kubenswrapper[4830]: W0318 18:28:01.051824 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09371978_713b_4735_b7c2_28600f7a30af.slice/crio-a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f WatchSource:0}: Error finding container a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f: Status 404 returned error can't find the container with id a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f Mar 18 18:28:01 crc kubenswrapper[4830]: I0318 18:28:01.329237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" event={"ID":"09371978-713b-4735-b7c2-28600f7a30af","Type":"ContainerStarted","Data":"a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f"} Mar 18 18:28:03 crc kubenswrapper[4830]: I0318 18:28:03.354853 4830 generic.go:334] "Generic (PLEG): container finished" podID="09371978-713b-4735-b7c2-28600f7a30af" containerID="3e40494ebbee3cd7c874d1f58eac924538aa69de61dd4491956ed6e88edb310a" exitCode=0 Mar 18 18:28:03 crc kubenswrapper[4830]: I0318 18:28:03.354920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" event={"ID":"09371978-713b-4735-b7c2-28600f7a30af","Type":"ContainerDied","Data":"3e40494ebbee3cd7c874d1f58eac924538aa69de61dd4491956ed6e88edb310a"} Mar 18 18:28:04 crc kubenswrapper[4830]: I0318 18:28:04.719433 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:04 crc kubenswrapper[4830]: I0318 18:28:04.748752 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprn8\" (UniqueName: \"kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8\") pod \"09371978-713b-4735-b7c2-28600f7a30af\" (UID: \"09371978-713b-4735-b7c2-28600f7a30af\") " Mar 18 18:28:04 crc kubenswrapper[4830]: I0318 18:28:04.758220 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8" (OuterVolumeSpecName: "kube-api-access-jprn8") pod "09371978-713b-4735-b7c2-28600f7a30af" (UID: "09371978-713b-4735-b7c2-28600f7a30af"). InnerVolumeSpecName "kube-api-access-jprn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:28:04 crc kubenswrapper[4830]: I0318 18:28:04.850418 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprn8\" (UniqueName: \"kubernetes.io/projected/09371978-713b-4735-b7c2-28600f7a30af-kube-api-access-jprn8\") on node \"crc\" DevicePath \"\"" Mar 18 18:28:05 crc kubenswrapper[4830]: I0318 18:28:05.379027 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" event={"ID":"09371978-713b-4735-b7c2-28600f7a30af","Type":"ContainerDied","Data":"a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f"} Mar 18 18:28:05 crc kubenswrapper[4830]: I0318 18:28:05.379095 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21eec20d8f7335b9525e3ae27a1805011508461d02dd1dd0685c199e96ed39f" Mar 18 18:28:05 crc kubenswrapper[4830]: I0318 18:28:05.379511 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-8s2dd" Mar 18 18:28:05 crc kubenswrapper[4830]: I0318 18:28:05.824685 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-vdnd2"] Mar 18 18:28:05 crc kubenswrapper[4830]: I0318 18:28:05.831781 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-vdnd2"] Mar 18 18:28:06 crc kubenswrapper[4830]: I0318 18:28:06.248310 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b94971-0d4a-45d3-8b41-4730dbcd9c81" path="/var/lib/kubelet/pods/00b94971-0d4a-45d3-8b41-4730dbcd9c81/volumes" Mar 18 18:28:14 crc kubenswrapper[4830]: I0318 18:28:14.977732 4830 scope.go:117] "RemoveContainer" containerID="1107cede367b6677dfcb133136a1fc600b75615f76790b67e94f9a87ead486bd" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.008609 4830 scope.go:117] "RemoveContainer" containerID="41f91f23eaf980683c2e539fdf83cddc5976b7b8cc570cd5a0dd21341c97981e" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.043334 4830 scope.go:117] "RemoveContainer" containerID="9e6361c5ac167888b9787c9edea1f987baf91625a38cc774ca68d2fd5c29279b" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.095528 4830 scope.go:117] "RemoveContainer" containerID="1beae2c931be0c4737d38ea81296666d857479db917f360e7a3d94688386583c" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.127042 4830 scope.go:117] "RemoveContainer" containerID="a777ec6c17a30f13877088b3dbdfc19eeb5cb76c8ad52c4fdc99d771910af2fb" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.154909 4830 scope.go:117] "RemoveContainer" containerID="7f2292a71a9c798a2c17f9c6fc6b6d12fc68c263b920ce067f07390c0bc23f23" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.175872 4830 scope.go:117] "RemoveContainer" containerID="27397b7f0510c55e80bce8e6b2943cb87b8939f2cbb3fc1f61ad4e083d4a54cd" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.202167 4830 
scope.go:117] "RemoveContainer" containerID="169735e5b77a4533222c6bab6fdb49feec714327883d4fd25c5a424ede227b9d" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.230602 4830 scope.go:117] "RemoveContainer" containerID="95aed85a54b5c9ff61365b28e7d347033cc76e530b5b1df84aab3499edc37f6b" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.251024 4830 scope.go:117] "RemoveContainer" containerID="4c0b35b5d195b0f9135287d08ef0223c294be7b2eccc6c9451f2bc66faf3424e" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.299080 4830 scope.go:117] "RemoveContainer" containerID="29fc62aa8b0c7dff64144c93d1f53c7be2667c73d45b77f8b2e9fee0136dd279" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.322123 4830 scope.go:117] "RemoveContainer" containerID="3607115effce7bd86408002d82ffa86b5174142250b309d1f475cbb22f7dc450" Mar 18 18:28:15 crc kubenswrapper[4830]: I0318 18:28:15.345232 4830 scope.go:117] "RemoveContainer" containerID="2798e226f5d33632ac3e39e1b2da992a846726cf311ecc786f14ceaefbe70926" Mar 18 18:28:29 crc kubenswrapper[4830]: I0318 18:28:29.509883 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:28:29 crc kubenswrapper[4830]: I0318 18:28:29.510743 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:28:59 crc kubenswrapper[4830]: I0318 18:28:59.509402 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:28:59 crc kubenswrapper[4830]: I0318 18:28:59.510176 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.573355 4830 scope.go:117] "RemoveContainer" containerID="701ea6d1b0fdb51a72c68cdb182a16279c852e27d1fa57098972a41a762217b5" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.606937 4830 scope.go:117] "RemoveContainer" containerID="21a4f27449c9ea1b7a1149cfd63216765da9dd6af1f60659938e91d068c9e30d" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.667885 4830 scope.go:117] "RemoveContainer" containerID="966ee135f8e6e3d440939198bd3d2a3c627df5403d51fc43caced871d92ebe29" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.701096 4830 scope.go:117] "RemoveContainer" containerID="0e020237f20895e44cd22804f80d8e0f9db0bf852624cfadd9a1771f880e84b4" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.760820 4830 scope.go:117] "RemoveContainer" containerID="ff1f5ffd4b74593f213089537aa13600d46dd89ff13f70a78e08d9778968bb0c" Mar 18 18:29:15 crc kubenswrapper[4830]: I0318 18:29:15.821322 4830 scope.go:117] "RemoveContainer" containerID="b60c175c22ac7e53fe71b2e37caf616851ec9cee94f547bc491df94aa3933b54" Mar 18 18:29:29 crc kubenswrapper[4830]: I0318 18:29:29.509496 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:29:29 crc kubenswrapper[4830]: I0318 18:29:29.510269 
4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:29:29 crc kubenswrapper[4830]: I0318 18:29:29.510341 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:29:29 crc kubenswrapper[4830]: I0318 18:29:29.511297 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:29:29 crc kubenswrapper[4830]: I0318 18:29:29.511404 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" gracePeriod=600 Mar 18 18:29:29 crc kubenswrapper[4830]: E0318 18:29:29.650381 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:29:30 crc kubenswrapper[4830]: I0318 18:29:30.276440 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" exitCode=0 Mar 18 18:29:30 crc kubenswrapper[4830]: I0318 18:29:30.276530 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"} Mar 18 18:29:30 crc kubenswrapper[4830]: I0318 18:29:30.276606 4830 scope.go:117] "RemoveContainer" containerID="7e673d7cc71d559a72795b6d3a15f56048a692df9a147924f348d8b7d4cd054a" Mar 18 18:29:30 crc kubenswrapper[4830]: I0318 18:29:30.277924 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" Mar 18 18:29:30 crc kubenswrapper[4830]: E0318 18:29:30.278457 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:29:41 crc kubenswrapper[4830]: I0318 18:29:41.234477 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" Mar 18 18:29:41 crc kubenswrapper[4830]: E0318 18:29:41.235480 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 
18:29:52 crc kubenswrapper[4830]: I0318 18:29:52.235679 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" Mar 18 18:29:52 crc kubenswrapper[4830]: E0318 18:29:52.236505 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.388537 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564310-s8v2b"] Mar 18 18:30:00 crc kubenswrapper[4830]: E0318 18:30:00.389498 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09371978-713b-4735-b7c2-28600f7a30af" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.389515 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="09371978-713b-4735-b7c2-28600f7a30af" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.389651 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="09371978-713b-4735-b7c2-28600f7a30af" containerName="oc" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.390096 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-s8v2b" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.392986 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.393087 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.399294 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.418358 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-s8v2b"] Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.428557 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"] Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.431628 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.435211 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.435426 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.441011 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"] Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.523347 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2tt\" (UniqueName: \"kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt\") pod \"auto-csr-approver-29564310-s8v2b\" (UID: \"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3\") " pod="openshift-infra/auto-csr-approver-29564310-s8v2b" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.624729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvpj\" (UniqueName: \"kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.625383 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.625524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2tt\" (UniqueName: \"kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt\") pod \"auto-csr-approver-29564310-s8v2b\" (UID: \"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3\") " pod="openshift-infra/auto-csr-approver-29564310-s8v2b" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.625676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.644047 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2tt\" (UniqueName: \"kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt\") pod \"auto-csr-approver-29564310-s8v2b\" (UID: \"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3\") " pod="openshift-infra/auto-csr-approver-29564310-s8v2b" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.724202 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-s8v2b" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.726710 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.726861 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvpj\" (UniqueName: \"kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.726967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.728199 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.735408 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.742623 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvpj\" (UniqueName: \"kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj\") pod \"collect-profiles-29564310-lpqmq\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:00 crc kubenswrapper[4830]: I0318 18:30:00.778411 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.015615 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-s8v2b"] Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.045862 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"] Mar 18 18:30:01 crc kubenswrapper[4830]: W0318 18:30:01.053500 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb0be4c_a5d0_4933_b04a_74b521c9a26b.slice/crio-0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52 WatchSource:0}: Error finding container 0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52: Status 404 returned error can't find the container with id 0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52 Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.591507 4830 generic.go:334] "Generic (PLEG): container finished" podID="4eb0be4c-a5d0-4933-b04a-74b521c9a26b" 
containerID="816a46d175ee8b1b5e08a4d5b9b015164dd17158fa145c0eb40cfb987577a846" exitCode=0 Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.591908 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" event={"ID":"4eb0be4c-a5d0-4933-b04a-74b521c9a26b","Type":"ContainerDied","Data":"816a46d175ee8b1b5e08a4d5b9b015164dd17158fa145c0eb40cfb987577a846"} Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.591942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" event={"ID":"4eb0be4c-a5d0-4933-b04a-74b521c9a26b","Type":"ContainerStarted","Data":"0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52"} Mar 18 18:30:01 crc kubenswrapper[4830]: I0318 18:30:01.595029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-s8v2b" event={"ID":"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3","Type":"ContainerStarted","Data":"fe27e610de5dd9bcd91155633fc5f4340e4d234761e64fa124067f15713a78bf"} Mar 18 18:30:02 crc kubenswrapper[4830]: I0318 18:30:02.914397 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.066525 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume\") pod \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") "
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.066588 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume\") pod \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") "
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.066661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gvpj\" (UniqueName: \"kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj\") pod \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\" (UID: \"4eb0be4c-a5d0-4933-b04a-74b521c9a26b\") "
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.067550 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume" (OuterVolumeSpecName: "config-volume") pod "4eb0be4c-a5d0-4933-b04a-74b521c9a26b" (UID: "4eb0be4c-a5d0-4933-b04a-74b521c9a26b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.068024 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.075533 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4eb0be4c-a5d0-4933-b04a-74b521c9a26b" (UID: "4eb0be4c-a5d0-4933-b04a-74b521c9a26b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.075534 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj" (OuterVolumeSpecName: "kube-api-access-2gvpj") pod "4eb0be4c-a5d0-4933-b04a-74b521c9a26b" (UID: "4eb0be4c-a5d0-4933-b04a-74b521c9a26b"). InnerVolumeSpecName "kube-api-access-2gvpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.169423 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.169473 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gvpj\" (UniqueName: \"kubernetes.io/projected/4eb0be4c-a5d0-4933-b04a-74b521c9a26b-kube-api-access-2gvpj\") on node \"crc\" DevicePath \"\""
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.617989 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.617996 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq" event={"ID":"4eb0be4c-a5d0-4933-b04a-74b521c9a26b","Type":"ContainerDied","Data":"0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52"}
Mar 18 18:30:03 crc kubenswrapper[4830]: I0318 18:30:03.618462 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc76bb415bd2d8de26df86432d3331ef525c815c3da58cc482005d89603ce52"
Mar 18 18:30:04 crc kubenswrapper[4830]: I0318 18:30:04.632104 4830 generic.go:334] "Generic (PLEG): container finished" podID="bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" containerID="7f31f89187c0c58482ab6367789ba55810814555a97cb99dee7dbde8b6fc051c" exitCode=0
Mar 18 18:30:04 crc kubenswrapper[4830]: I0318 18:30:04.632159 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-s8v2b" event={"ID":"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3","Type":"ContainerDied","Data":"7f31f89187c0c58482ab6367789ba55810814555a97cb99dee7dbde8b6fc051c"}
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.017935 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-s8v2b"
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.114816 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2tt\" (UniqueName: \"kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt\") pod \"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3\" (UID: \"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3\") "
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.124066 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt" (OuterVolumeSpecName: "kube-api-access-mz2tt") pod "bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" (UID: "bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3"). InnerVolumeSpecName "kube-api-access-mz2tt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.217377 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2tt\" (UniqueName: \"kubernetes.io/projected/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3-kube-api-access-mz2tt\") on node \"crc\" DevicePath \"\""
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.651190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-s8v2b" event={"ID":"bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3","Type":"ContainerDied","Data":"fe27e610de5dd9bcd91155633fc5f4340e4d234761e64fa124067f15713a78bf"}
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.651239 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe27e610de5dd9bcd91155633fc5f4340e4d234761e64fa124067f15713a78bf"
Mar 18 18:30:06 crc kubenswrapper[4830]: I0318 18:30:06.651314 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-s8v2b"
Mar 18 18:30:07 crc kubenswrapper[4830]: I0318 18:30:07.113571 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-4mmp9"]
Mar 18 18:30:07 crc kubenswrapper[4830]: I0318 18:30:07.125215 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-4mmp9"]
Mar 18 18:30:07 crc kubenswrapper[4830]: I0318 18:30:07.235364 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:30:07 crc kubenswrapper[4830]: E0318 18:30:07.235581 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:30:08 crc kubenswrapper[4830]: I0318 18:30:08.246116 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e46d34-ca52-4c41-88c3-376e6219e90f" path="/var/lib/kubelet/pods/86e46d34-ca52-4c41-88c3-376e6219e90f/volumes"
Mar 18 18:30:15 crc kubenswrapper[4830]: I0318 18:30:15.943986 4830 scope.go:117] "RemoveContainer" containerID="4cc141c38da7f2f14e8af81b886f2466b15b63a804233f4ae743bb0e785d7d90"
Mar 18 18:30:15 crc kubenswrapper[4830]: I0318 18:30:15.975497 4830 scope.go:117] "RemoveContainer" containerID="529a3e227f8ade8e3311b168c9e554155f352338f229794d4a4c7d826510abae"
Mar 18 18:30:16 crc kubenswrapper[4830]: I0318 18:30:16.033254 4830 scope.go:117] "RemoveContainer" containerID="91aff4166cbebec7917a849f1dae12a4f2caababfa680539bc75bf53f49cf551"
Mar 18 18:30:16 crc kubenswrapper[4830]: I0318 18:30:16.064629 4830 scope.go:117] "RemoveContainer" containerID="a497502cf1e6b810a8afc3afffce3046c12bda7092873c827fdf04d4ed710b99"
Mar 18 18:30:16 crc kubenswrapper[4830]: I0318 18:30:16.130725 4830 scope.go:117] "RemoveContainer" containerID="7269557d2134b5328a9871c88e8307ed1155b8cea2686c2ae04cc355079f438f"
Mar 18 18:30:19 crc kubenswrapper[4830]: I0318 18:30:19.235380 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:30:19 crc kubenswrapper[4830]: E0318 18:30:19.236308 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:30:33 crc kubenswrapper[4830]: I0318 18:30:33.234953 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:30:33 crc kubenswrapper[4830]: E0318 18:30:33.236142 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:30:45 crc kubenswrapper[4830]: I0318 18:30:45.235178 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:30:45 crc kubenswrapper[4830]: E0318 18:30:45.236011 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:30:57 crc kubenswrapper[4830]: I0318 18:30:57.235067 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:30:57 crc kubenswrapper[4830]: E0318 18:30:57.236145 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:31:10 crc kubenswrapper[4830]: I0318 18:31:10.235414 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:31:10 crc kubenswrapper[4830]: E0318 18:31:10.236750 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:31:24 crc kubenswrapper[4830]: I0318 18:31:24.235014 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:31:24 crc kubenswrapper[4830]: E0318 18:31:24.236310 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:31:36 crc kubenswrapper[4830]: I0318 18:31:36.244975 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:31:36 crc kubenswrapper[4830]: E0318 18:31:36.246157 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:31:48 crc kubenswrapper[4830]: I0318 18:31:48.235226 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:31:48 crc kubenswrapper[4830]: E0318 18:31:48.237359 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.161638 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564312-tt6hm"]
Mar 18 18:32:00 crc kubenswrapper[4830]: E0318 18:32:00.163067 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb0be4c-a5d0-4933-b04a-74b521c9a26b" containerName="collect-profiles"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.163275 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb0be4c-a5d0-4933-b04a-74b521c9a26b" containerName="collect-profiles"
Mar 18 18:32:00 crc kubenswrapper[4830]: E0318 18:32:00.163306 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" containerName="oc"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.163322 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" containerName="oc"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.163682 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" containerName="oc"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.163724 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb0be4c-a5d0-4933-b04a-74b521c9a26b" containerName="collect-profiles"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.164675 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.167855 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.168585 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.171658 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.185617 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-tt6hm"]
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.234812 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:32:00 crc kubenswrapper[4830]: E0318 18:32:00.235351 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.282181 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88m7k\" (UniqueName: \"kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k\") pod \"auto-csr-approver-29564312-tt6hm\" (UID: \"685eb33e-eea1-47ad-bbf4-567dc5821cac\") " pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.383930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88m7k\" (UniqueName: \"kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k\") pod \"auto-csr-approver-29564312-tt6hm\" (UID: \"685eb33e-eea1-47ad-bbf4-567dc5821cac\") " pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.407468 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88m7k\" (UniqueName: \"kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k\") pod \"auto-csr-approver-29564312-tt6hm\" (UID: \"685eb33e-eea1-47ad-bbf4-567dc5821cac\") " pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.535022 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.832008 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-tt6hm"]
Mar 18 18:32:00 crc kubenswrapper[4830]: I0318 18:32:00.843683 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 18:32:01 crc kubenswrapper[4830]: I0318 18:32:01.699114 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-tt6hm" event={"ID":"685eb33e-eea1-47ad-bbf4-567dc5821cac","Type":"ContainerStarted","Data":"651fa7943a7240cdcea4c4e5803aa4177ed6407e9af39413e59a21089f0beb0e"}
Mar 18 18:32:02 crc kubenswrapper[4830]: I0318 18:32:02.708196 4830 generic.go:334] "Generic (PLEG): container finished" podID="685eb33e-eea1-47ad-bbf4-567dc5821cac" containerID="a00bd79a7aa0d7b7bfea0d24aeab33dadc514aba38d34746691e4adcd95d9fe8" exitCode=0
Mar 18 18:32:02 crc kubenswrapper[4830]: I0318 18:32:02.708527 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-tt6hm" event={"ID":"685eb33e-eea1-47ad-bbf4-567dc5821cac","Type":"ContainerDied","Data":"a00bd79a7aa0d7b7bfea0d24aeab33dadc514aba38d34746691e4adcd95d9fe8"}
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.093233 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.240936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88m7k\" (UniqueName: \"kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k\") pod \"685eb33e-eea1-47ad-bbf4-567dc5821cac\" (UID: \"685eb33e-eea1-47ad-bbf4-567dc5821cac\") "
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.249077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k" (OuterVolumeSpecName: "kube-api-access-88m7k") pod "685eb33e-eea1-47ad-bbf4-567dc5821cac" (UID: "685eb33e-eea1-47ad-bbf4-567dc5821cac"). InnerVolumeSpecName "kube-api-access-88m7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.342826 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88m7k\" (UniqueName: \"kubernetes.io/projected/685eb33e-eea1-47ad-bbf4-567dc5821cac-kube-api-access-88m7k\") on node \"crc\" DevicePath \"\""
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.732411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-tt6hm" event={"ID":"685eb33e-eea1-47ad-bbf4-567dc5821cac","Type":"ContainerDied","Data":"651fa7943a7240cdcea4c4e5803aa4177ed6407e9af39413e59a21089f0beb0e"}
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.732467 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651fa7943a7240cdcea4c4e5803aa4177ed6407e9af39413e59a21089f0beb0e"
Mar 18 18:32:04 crc kubenswrapper[4830]: I0318 18:32:04.732607 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-tt6hm"
Mar 18 18:32:05 crc kubenswrapper[4830]: I0318 18:32:05.165622 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-q66sx"]
Mar 18 18:32:05 crc kubenswrapper[4830]: I0318 18:32:05.172220 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-q66sx"]
Mar 18 18:32:06 crc kubenswrapper[4830]: I0318 18:32:06.248718 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07cd51e9-9091-4a1c-8055-448c0df097ed" path="/var/lib/kubelet/pods/07cd51e9-9091-4a1c-8055-448c0df097ed/volumes"
Mar 18 18:32:13 crc kubenswrapper[4830]: I0318 18:32:13.234502 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:32:13 crc kubenswrapper[4830]: E0318 18:32:13.235646 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:32:16 crc kubenswrapper[4830]: I0318 18:32:16.271694 4830 scope.go:117] "RemoveContainer" containerID="3397918b5d9c6f9e9358b8667a445e69620688770fa08a78734184570748fd51"
Mar 18 18:32:25 crc kubenswrapper[4830]: I0318 18:32:25.235450 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:32:25 crc kubenswrapper[4830]: E0318 18:32:25.237368 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:32:38 crc kubenswrapper[4830]: I0318 18:32:38.236620 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:32:38 crc kubenswrapper[4830]: E0318 18:32:38.237498 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:32:52 crc kubenswrapper[4830]: I0318 18:32:52.235348 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:32:52 crc kubenswrapper[4830]: E0318 18:32:52.236565 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:33:05 crc kubenswrapper[4830]: I0318 18:33:05.235153 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:33:05 crc kubenswrapper[4830]: E0318 18:33:05.236431 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:33:18 crc kubenswrapper[4830]: I0318 18:33:18.235317 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:33:18 crc kubenswrapper[4830]: E0318 18:33:18.236461 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:33:29 crc kubenswrapper[4830]: I0318 18:33:29.234434 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:33:29 crc kubenswrapper[4830]: E0318 18:33:29.235799 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:33:42 crc kubenswrapper[4830]: I0318 18:33:42.235901 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:33:42 crc kubenswrapper[4830]: E0318 18:33:42.237140 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:33:57 crc kubenswrapper[4830]: I0318 18:33:57.236019 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:33:57 crc kubenswrapper[4830]: E0318 18:33:57.237672 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.151497 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564314-lxbdr"]
Mar 18 18:34:00 crc kubenswrapper[4830]: E0318 18:34:00.152102 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685eb33e-eea1-47ad-bbf4-567dc5821cac" containerName="oc"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.152117 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="685eb33e-eea1-47ad-bbf4-567dc5821cac" containerName="oc"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.152312 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="685eb33e-eea1-47ad-bbf4-567dc5821cac" containerName="oc"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.152821 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.156108 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.156937 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.161368 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-lxbdr"]
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.162221 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.196712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zgv\" (UniqueName: \"kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv\") pod \"auto-csr-approver-29564314-lxbdr\" (UID: \"ddccc616-3d55-449e-a9fe-b0769f1b1034\") " pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.298385 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zgv\" (UniqueName: \"kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv\") pod \"auto-csr-approver-29564314-lxbdr\" (UID: \"ddccc616-3d55-449e-a9fe-b0769f1b1034\") " pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.332172 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zgv\" (UniqueName: \"kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv\") pod \"auto-csr-approver-29564314-lxbdr\" (UID: \"ddccc616-3d55-449e-a9fe-b0769f1b1034\") " pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.485853 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:00 crc kubenswrapper[4830]: I0318 18:34:00.984665 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-lxbdr"]
Mar 18 18:34:00 crc kubenswrapper[4830]: W0318 18:34:00.990694 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddccc616_3d55_449e_a9fe_b0769f1b1034.slice/crio-bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d WatchSource:0}: Error finding container bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d: Status 404 returned error can't find the container with id bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d
Mar 18 18:34:01 crc kubenswrapper[4830]: I0318 18:34:01.819885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-lxbdr" event={"ID":"ddccc616-3d55-449e-a9fe-b0769f1b1034","Type":"ContainerStarted","Data":"bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d"}
Mar 18 18:34:02 crc kubenswrapper[4830]: I0318 18:34:02.832230 4830 generic.go:334] "Generic (PLEG): container finished" podID="ddccc616-3d55-449e-a9fe-b0769f1b1034" containerID="d3d1f2f024b45ab977ec859af2ce98d94c4a018f026966edfde536e41cb83407" exitCode=0
Mar 18 18:34:02 crc kubenswrapper[4830]: I0318 18:34:02.832321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-lxbdr" event={"ID":"ddccc616-3d55-449e-a9fe-b0769f1b1034","Type":"ContainerDied","Data":"d3d1f2f024b45ab977ec859af2ce98d94c4a018f026966edfde536e41cb83407"}
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.215411 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.359254 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zgv\" (UniqueName: \"kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv\") pod \"ddccc616-3d55-449e-a9fe-b0769f1b1034\" (UID: \"ddccc616-3d55-449e-a9fe-b0769f1b1034\") "
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.366056 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv" (OuterVolumeSpecName: "kube-api-access-k6zgv") pod "ddccc616-3d55-449e-a9fe-b0769f1b1034" (UID: "ddccc616-3d55-449e-a9fe-b0769f1b1034"). InnerVolumeSpecName "kube-api-access-k6zgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.461405 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zgv\" (UniqueName: \"kubernetes.io/projected/ddccc616-3d55-449e-a9fe-b0769f1b1034-kube-api-access-k6zgv\") on node \"crc\" DevicePath \"\""
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.855347 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-lxbdr" event={"ID":"ddccc616-3d55-449e-a9fe-b0769f1b1034","Type":"ContainerDied","Data":"bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d"}
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.855594 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab2ff6ee1dd72afd278818d12b011d2d3528c97623e89f44a5e2ef31dab193d"
Mar 18 18:34:04 crc kubenswrapper[4830]: I0318 18:34:04.855667 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-lxbdr"
Mar 18 18:34:05 crc kubenswrapper[4830]: I0318 18:34:05.290012 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-8s2dd"]
Mar 18 18:34:05 crc kubenswrapper[4830]: I0318 18:34:05.299350 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-8s2dd"]
Mar 18 18:34:06 crc kubenswrapper[4830]: I0318 18:34:06.279817 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09371978-713b-4735-b7c2-28600f7a30af" path="/var/lib/kubelet/pods/09371978-713b-4735-b7c2-28600f7a30af/volumes"
Mar 18 18:34:10 crc kubenswrapper[4830]: I0318 18:34:10.235047 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:34:10 crc kubenswrapper[4830]: E0318 18:34:10.235880 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:34:16 crc kubenswrapper[4830]: I0318 18:34:16.396969 4830 scope.go:117] "RemoveContainer" containerID="3e40494ebbee3cd7c874d1f58eac924538aa69de61dd4491956ed6e88edb310a"
Mar 18 18:34:23 crc kubenswrapper[4830]: I0318 18:34:23.234674 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:34:23 crc kubenswrapper[4830]: E0318 18:34:23.236519 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:34:35 crc kubenswrapper[4830]: I0318 18:34:35.235469 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b"
Mar 18 18:34:36 crc kubenswrapper[4830]: I0318 18:34:36.175412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf"}
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.162876 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564316-q5l4v"]
Mar 18 18:36:00 crc kubenswrapper[4830]: E0318 18:36:00.164102 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddccc616-3d55-449e-a9fe-b0769f1b1034" containerName="oc"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.164137 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddccc616-3d55-449e-a9fe-b0769f1b1034" containerName="oc"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.164552 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddccc616-3d55-449e-a9fe-b0769f1b1034" containerName="oc"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.165604 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-q5l4v"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.169452 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.173548 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-q5l4v"]
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.177957 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.178052 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.341260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5l4\" (UniqueName: \"kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4\") pod \"auto-csr-approver-29564316-q5l4v\" (UID: \"8ca6178c-b87d-48ce-9e4d-bed2d9d62174\") " pod="openshift-infra/auto-csr-approver-29564316-q5l4v"
Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.442569 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5l4\" (UniqueName:
\"kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4\") pod \"auto-csr-approver-29564316-q5l4v\" (UID: \"8ca6178c-b87d-48ce-9e4d-bed2d9d62174\") " pod="openshift-infra/auto-csr-approver-29564316-q5l4v" Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.488113 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5l4\" (UniqueName: \"kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4\") pod \"auto-csr-approver-29564316-q5l4v\" (UID: \"8ca6178c-b87d-48ce-9e4d-bed2d9d62174\") " pod="openshift-infra/auto-csr-approver-29564316-q5l4v" Mar 18 18:36:00 crc kubenswrapper[4830]: I0318 18:36:00.543990 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" Mar 18 18:36:01 crc kubenswrapper[4830]: I0318 18:36:01.002459 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-q5l4v"] Mar 18 18:36:02 crc kubenswrapper[4830]: I0318 18:36:02.026335 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" event={"ID":"8ca6178c-b87d-48ce-9e4d-bed2d9d62174","Type":"ContainerStarted","Data":"5183d207ea3a4e5f9b9bb13544ae55e1e88aab8ba24c26efc48e1c8f3e87443b"} Mar 18 18:36:04 crc kubenswrapper[4830]: I0318 18:36:04.049803 4830 generic.go:334] "Generic (PLEG): container finished" podID="8ca6178c-b87d-48ce-9e4d-bed2d9d62174" containerID="669ee0a9ffa94ead7911f10d351ad6cf26d95073b1bbfaea68e97275434fe9d2" exitCode=0 Mar 18 18:36:04 crc kubenswrapper[4830]: I0318 18:36:04.049886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" event={"ID":"8ca6178c-b87d-48ce-9e4d-bed2d9d62174","Type":"ContainerDied","Data":"669ee0a9ffa94ead7911f10d351ad6cf26d95073b1bbfaea68e97275434fe9d2"} Mar 18 18:36:05 crc kubenswrapper[4830]: I0318 18:36:05.374100 4830 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" Mar 18 18:36:05 crc kubenswrapper[4830]: I0318 18:36:05.478101 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl5l4\" (UniqueName: \"kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4\") pod \"8ca6178c-b87d-48ce-9e4d-bed2d9d62174\" (UID: \"8ca6178c-b87d-48ce-9e4d-bed2d9d62174\") " Mar 18 18:36:05 crc kubenswrapper[4830]: I0318 18:36:05.485807 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4" (OuterVolumeSpecName: "kube-api-access-sl5l4") pod "8ca6178c-b87d-48ce-9e4d-bed2d9d62174" (UID: "8ca6178c-b87d-48ce-9e4d-bed2d9d62174"). InnerVolumeSpecName "kube-api-access-sl5l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:36:05 crc kubenswrapper[4830]: I0318 18:36:05.580379 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl5l4\" (UniqueName: \"kubernetes.io/projected/8ca6178c-b87d-48ce-9e4d-bed2d9d62174-kube-api-access-sl5l4\") on node \"crc\" DevicePath \"\"" Mar 18 18:36:06 crc kubenswrapper[4830]: I0318 18:36:06.069860 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" event={"ID":"8ca6178c-b87d-48ce-9e4d-bed2d9d62174","Type":"ContainerDied","Data":"5183d207ea3a4e5f9b9bb13544ae55e1e88aab8ba24c26efc48e1c8f3e87443b"} Mar 18 18:36:06 crc kubenswrapper[4830]: I0318 18:36:06.069908 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-q5l4v" Mar 18 18:36:06 crc kubenswrapper[4830]: I0318 18:36:06.069917 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5183d207ea3a4e5f9b9bb13544ae55e1e88aab8ba24c26efc48e1c8f3e87443b" Mar 18 18:36:06 crc kubenswrapper[4830]: I0318 18:36:06.475869 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-s8v2b"] Mar 18 18:36:06 crc kubenswrapper[4830]: I0318 18:36:06.484396 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-s8v2b"] Mar 18 18:36:08 crc kubenswrapper[4830]: I0318 18:36:08.248671 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3" path="/var/lib/kubelet/pods/bbb06e67-3f5b-41e5-97e7-b0c8ad433cf3/volumes" Mar 18 18:36:16 crc kubenswrapper[4830]: I0318 18:36:16.507347 4830 scope.go:117] "RemoveContainer" containerID="7f31f89187c0c58482ab6367789ba55810814555a97cb99dee7dbde8b6fc051c" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.236659 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:33 crc kubenswrapper[4830]: E0318 18:36:33.237588 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca6178c-b87d-48ce-9e4d-bed2d9d62174" containerName="oc" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.237604 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca6178c-b87d-48ce-9e4d-bed2d9d62174" containerName="oc" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.237816 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca6178c-b87d-48ce-9e4d-bed2d9d62174" containerName="oc" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.238924 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.259362 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.382005 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzggv\" (UniqueName: \"kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.382075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.382226 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.483282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.483392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-pzggv\" (UniqueName: \"kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.483427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.484179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.484213 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.504531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzggv\" (UniqueName: \"kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv\") pod \"redhat-operators-bnzn7\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:33 crc kubenswrapper[4830]: I0318 18:36:33.559812 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:34 crc kubenswrapper[4830]: I0318 18:36:34.000959 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:34 crc kubenswrapper[4830]: I0318 18:36:34.325213 4830 generic.go:334] "Generic (PLEG): container finished" podID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerID="1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32" exitCode=0 Mar 18 18:36:34 crc kubenswrapper[4830]: I0318 18:36:34.325473 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerDied","Data":"1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32"} Mar 18 18:36:34 crc kubenswrapper[4830]: I0318 18:36:34.325497 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerStarted","Data":"1c5e08c24f1f95ba0b22a4f309aeee6a0f7225a81888db7957468bd979748a5b"} Mar 18 18:36:35 crc kubenswrapper[4830]: I0318 18:36:35.333480 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerStarted","Data":"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22"} Mar 18 18:36:36 crc kubenswrapper[4830]: I0318 18:36:36.347477 4830 generic.go:334] "Generic (PLEG): container finished" podID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerID="d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22" exitCode=0 Mar 18 18:36:36 crc kubenswrapper[4830]: I0318 18:36:36.347543 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" 
event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerDied","Data":"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22"} Mar 18 18:36:37 crc kubenswrapper[4830]: I0318 18:36:37.361324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerStarted","Data":"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d"} Mar 18 18:36:37 crc kubenswrapper[4830]: I0318 18:36:37.399228 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bnzn7" podStartSLOduration=1.9473589489999998 podStartE2EDuration="4.39920559s" podCreationTimestamp="2026-03-18 18:36:33 +0000 UTC" firstStartedPulling="2026-03-18 18:36:34.327116689 +0000 UTC m=+2028.894747021" lastFinishedPulling="2026-03-18 18:36:36.77896332 +0000 UTC m=+2031.346593662" observedRunningTime="2026-03-18 18:36:37.393210519 +0000 UTC m=+2031.960840881" watchObservedRunningTime="2026-03-18 18:36:37.39920559 +0000 UTC m=+2031.966835952" Mar 18 18:36:43 crc kubenswrapper[4830]: I0318 18:36:43.560160 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:43 crc kubenswrapper[4830]: I0318 18:36:43.560986 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:44 crc kubenswrapper[4830]: I0318 18:36:44.621092 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnzn7" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="registry-server" probeResult="failure" output=< Mar 18 18:36:44 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 18:36:44 crc kubenswrapper[4830]: > Mar 18 18:36:53 crc kubenswrapper[4830]: I0318 18:36:53.621355 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:53 crc kubenswrapper[4830]: I0318 18:36:53.690222 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:53 crc kubenswrapper[4830]: I0318 18:36:53.873839 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:55 crc kubenswrapper[4830]: I0318 18:36:55.542476 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bnzn7" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="registry-server" containerID="cri-o://6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d" gracePeriod=2 Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.046829 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.230392 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzggv\" (UniqueName: \"kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv\") pod \"32d70335-6ca2-442d-ad26-38400b2b2ffc\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.230465 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities\") pod \"32d70335-6ca2-442d-ad26-38400b2b2ffc\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.230495 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content\") pod 
\"32d70335-6ca2-442d-ad26-38400b2b2ffc\" (UID: \"32d70335-6ca2-442d-ad26-38400b2b2ffc\") " Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.232043 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities" (OuterVolumeSpecName: "utilities") pod "32d70335-6ca2-442d-ad26-38400b2b2ffc" (UID: "32d70335-6ca2-442d-ad26-38400b2b2ffc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.240348 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv" (OuterVolumeSpecName: "kube-api-access-pzggv") pod "32d70335-6ca2-442d-ad26-38400b2b2ffc" (UID: "32d70335-6ca2-442d-ad26-38400b2b2ffc"). InnerVolumeSpecName "kube-api-access-pzggv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.332593 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzggv\" (UniqueName: \"kubernetes.io/projected/32d70335-6ca2-442d-ad26-38400b2b2ffc-kube-api-access-pzggv\") on node \"crc\" DevicePath \"\"" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.332620 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.368358 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32d70335-6ca2-442d-ad26-38400b2b2ffc" (UID: "32d70335-6ca2-442d-ad26-38400b2b2ffc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.433493 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d70335-6ca2-442d-ad26-38400b2b2ffc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.550803 4830 generic.go:334] "Generic (PLEG): container finished" podID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerID="6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d" exitCode=0 Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.550851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerDied","Data":"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d"} Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.550873 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnzn7" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.550894 4830 scope.go:117] "RemoveContainer" containerID="6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.550880 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnzn7" event={"ID":"32d70335-6ca2-442d-ad26-38400b2b2ffc","Type":"ContainerDied","Data":"1c5e08c24f1f95ba0b22a4f309aeee6a0f7225a81888db7957468bd979748a5b"} Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.584576 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.586934 4830 scope.go:117] "RemoveContainer" containerID="d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.590656 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bnzn7"] Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.605154 4830 scope.go:117] "RemoveContainer" containerID="1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.645080 4830 scope.go:117] "RemoveContainer" containerID="6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d" Mar 18 18:36:56 crc kubenswrapper[4830]: E0318 18:36:56.649231 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d\": container with ID starting with 6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d not found: ID does not exist" containerID="6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.649275 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d"} err="failed to get container status \"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d\": rpc error: code = NotFound desc = could not find container \"6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d\": container with ID starting with 6a849308186f7d25de2b9f3722dfdd291db9a37a565199b9293d59b540a0866d not found: ID does not exist" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.649308 4830 scope.go:117] "RemoveContainer" containerID="d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22" Mar 18 18:36:56 crc kubenswrapper[4830]: E0318 18:36:56.649613 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22\": container with ID starting with d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22 not found: ID does not exist" containerID="d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.649636 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22"} err="failed to get container status \"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22\": rpc error: code = NotFound desc = could not find container \"d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22\": container with ID starting with d86a9c7e2d48d9e986e17e0f093eb3cbb5b81fdbde1d996aa1f842b3c45dbc22 not found: ID does not exist" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.649650 4830 scope.go:117] "RemoveContainer" containerID="1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32" Mar 18 18:36:56 crc kubenswrapper[4830]: E0318 
18:36:56.650327 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32\": container with ID starting with 1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32 not found: ID does not exist" containerID="1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32" Mar 18 18:36:56 crc kubenswrapper[4830]: I0318 18:36:56.650344 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32"} err="failed to get container status \"1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32\": rpc error: code = NotFound desc = could not find container \"1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32\": container with ID starting with 1ad41ebfd0b69d34c2e986ff5d61d5345f130e6ae3ca19d3d0fbf68b97e12b32 not found: ID does not exist" Mar 18 18:36:58 crc kubenswrapper[4830]: I0318 18:36:58.253476 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" path="/var/lib/kubelet/pods/32d70335-6ca2-442d-ad26-38400b2b2ffc/volumes" Mar 18 18:36:59 crc kubenswrapper[4830]: I0318 18:36:59.509628 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:36:59 crc kubenswrapper[4830]: I0318 18:36:59.509751 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.075713 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:12 crc kubenswrapper[4830]: E0318 18:37:12.076999 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="extract-content" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.077027 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="extract-content" Mar 18 18:37:12 crc kubenswrapper[4830]: E0318 18:37:12.077072 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="registry-server" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.077085 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="registry-server" Mar 18 18:37:12 crc kubenswrapper[4830]: E0318 18:37:12.077119 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="extract-utilities" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.077132 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="extract-utilities" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.077402 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d70335-6ca2-442d-ad26-38400b2b2ffc" containerName="registry-server" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.080432 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.100218 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.176016 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw92k\" (UniqueName: \"kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.176342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.176565 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.278219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.278393 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cw92k\" (UniqueName: \"kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.278432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.278827 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.279028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.300834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw92k\" (UniqueName: \"kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k\") pod \"certified-operators-9nd72\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.443632 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:12 crc kubenswrapper[4830]: I0318 18:37:12.920075 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:13 crc kubenswrapper[4830]: I0318 18:37:13.696020 4830 generic.go:334] "Generic (PLEG): container finished" podID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerID="8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447" exitCode=0 Mar 18 18:37:13 crc kubenswrapper[4830]: I0318 18:37:13.696245 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerDied","Data":"8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447"} Mar 18 18:37:13 crc kubenswrapper[4830]: I0318 18:37:13.696269 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerStarted","Data":"1dbb658366d7cbc1234b8188adb07920849f03123a8f66092e4e003ee44338ae"} Mar 18 18:37:13 crc kubenswrapper[4830]: I0318 18:37:13.698067 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:37:14 crc kubenswrapper[4830]: I0318 18:37:14.708391 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerStarted","Data":"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7"} Mar 18 18:37:15 crc kubenswrapper[4830]: I0318 18:37:15.720355 4830 generic.go:334] "Generic (PLEG): container finished" podID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerID="d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7" exitCode=0 Mar 18 18:37:15 crc kubenswrapper[4830]: I0318 18:37:15.720408 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerDied","Data":"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7"} Mar 18 18:37:16 crc kubenswrapper[4830]: I0318 18:37:16.739827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerStarted","Data":"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76"} Mar 18 18:37:16 crc kubenswrapper[4830]: I0318 18:37:16.767494 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9nd72" podStartSLOduration=2.206588682 podStartE2EDuration="4.767466834s" podCreationTimestamp="2026-03-18 18:37:12 +0000 UTC" firstStartedPulling="2026-03-18 18:37:13.697882173 +0000 UTC m=+2068.265512495" lastFinishedPulling="2026-03-18 18:37:16.258760295 +0000 UTC m=+2070.826390647" observedRunningTime="2026-03-18 18:37:16.762248585 +0000 UTC m=+2071.329878927" watchObservedRunningTime="2026-03-18 18:37:16.767466834 +0000 UTC m=+2071.335097216" Mar 18 18:37:22 crc kubenswrapper[4830]: I0318 18:37:22.444906 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:22 crc kubenswrapper[4830]: I0318 18:37:22.445582 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:22 crc kubenswrapper[4830]: I0318 18:37:22.524213 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:22 crc kubenswrapper[4830]: I0318 18:37:22.868921 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:22 crc kubenswrapper[4830]: I0318 
18:37:22.934951 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:24 crc kubenswrapper[4830]: I0318 18:37:24.812718 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9nd72" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="registry-server" containerID="cri-o://f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76" gracePeriod=2 Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.795334 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.827149 4830 generic.go:334] "Generic (PLEG): container finished" podID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerID="f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76" exitCode=0 Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.827190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerDied","Data":"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76"} Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.827216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nd72" event={"ID":"623a670f-86c5-477c-a09c-9a1a811a6b8a","Type":"ContainerDied","Data":"1dbb658366d7cbc1234b8188adb07920849f03123a8f66092e4e003ee44338ae"} Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.827233 4830 scope.go:117] "RemoveContainer" containerID="f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.827249 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nd72" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.881866 4830 scope.go:117] "RemoveContainer" containerID="d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.900942 4830 scope.go:117] "RemoveContainer" containerID="8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.913554 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities\") pod \"623a670f-86c5-477c-a09c-9a1a811a6b8a\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.913600 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw92k\" (UniqueName: \"kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k\") pod \"623a670f-86c5-477c-a09c-9a1a811a6b8a\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.913642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content\") pod \"623a670f-86c5-477c-a09c-9a1a811a6b8a\" (UID: \"623a670f-86c5-477c-a09c-9a1a811a6b8a\") " Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.915114 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities" (OuterVolumeSpecName: "utilities") pod "623a670f-86c5-477c-a09c-9a1a811a6b8a" (UID: "623a670f-86c5-477c-a09c-9a1a811a6b8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.920826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k" (OuterVolumeSpecName: "kube-api-access-cw92k") pod "623a670f-86c5-477c-a09c-9a1a811a6b8a" (UID: "623a670f-86c5-477c-a09c-9a1a811a6b8a"). InnerVolumeSpecName "kube-api-access-cw92k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.935172 4830 scope.go:117] "RemoveContainer" containerID="f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76" Mar 18 18:37:25 crc kubenswrapper[4830]: E0318 18:37:25.936531 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76\": container with ID starting with f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76 not found: ID does not exist" containerID="f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.936588 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76"} err="failed to get container status \"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76\": rpc error: code = NotFound desc = could not find container \"f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76\": container with ID starting with f9dc1dc76a30e5d7c042355338e55948dfb7eb3f5168a0ee4c66d930a4283f76 not found: ID does not exist" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.936631 4830 scope.go:117] "RemoveContainer" containerID="d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7" Mar 18 18:37:25 crc kubenswrapper[4830]: E0318 18:37:25.937971 
4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7\": container with ID starting with d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7 not found: ID does not exist" containerID="d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.938112 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7"} err="failed to get container status \"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7\": rpc error: code = NotFound desc = could not find container \"d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7\": container with ID starting with d80868a22392495d1a74afb37f1e06756940a0801a903887118d7a44a67f62a7 not found: ID does not exist" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.938209 4830 scope.go:117] "RemoveContainer" containerID="8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447" Mar 18 18:37:25 crc kubenswrapper[4830]: E0318 18:37:25.938574 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447\": container with ID starting with 8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447 not found: ID does not exist" containerID="8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.938617 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447"} err="failed to get container status \"8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447\": rpc error: code = 
NotFound desc = could not find container \"8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447\": container with ID starting with 8a2219cb5aa9e9964b7170f4530bb2c5cdf6be0184b18ea188cd31820af45447 not found: ID does not exist" Mar 18 18:37:25 crc kubenswrapper[4830]: I0318 18:37:25.963480 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "623a670f-86c5-477c-a09c-9a1a811a6b8a" (UID: "623a670f-86c5-477c-a09c-9a1a811a6b8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.016603 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.016651 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw92k\" (UniqueName: \"kubernetes.io/projected/623a670f-86c5-477c-a09c-9a1a811a6b8a-kube-api-access-cw92k\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.016664 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a670f-86c5-477c-a09c-9a1a811a6b8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.151666 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.159960 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9nd72"] Mar 18 18:37:26 crc kubenswrapper[4830]: I0318 18:37:26.249870 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" path="/var/lib/kubelet/pods/623a670f-86c5-477c-a09c-9a1a811a6b8a/volumes" Mar 18 18:37:29 crc kubenswrapper[4830]: I0318 18:37:29.510219 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:37:29 crc kubenswrapper[4830]: I0318 18:37:29.510757 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.847717 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:32 crc kubenswrapper[4830]: E0318 18:37:32.848495 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="extract-utilities" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.848517 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="extract-utilities" Mar 18 18:37:32 crc kubenswrapper[4830]: E0318 18:37:32.848545 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="registry-server" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.848556 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="registry-server" Mar 18 18:37:32 crc kubenswrapper[4830]: E0318 18:37:32.848583 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="extract-content" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.848594 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="extract-content" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.848835 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="623a670f-86c5-477c-a09c-9a1a811a6b8a" containerName="registry-server" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.853903 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.859862 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.923880 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.924186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:32 crc kubenswrapper[4830]: I0318 18:37:32.924239 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276t4\" (UniqueName: \"kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4\") pod \"redhat-marketplace-2bgbm\" (UID: 
\"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.025824 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276t4\" (UniqueName: \"kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.025974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.025998 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.026628 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.026676 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " 
pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.053696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276t4\" (UniqueName: \"kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4\") pod \"redhat-marketplace-2bgbm\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.191684 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.663935 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.905783 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerStarted","Data":"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0"} Mar 18 18:37:33 crc kubenswrapper[4830]: I0318 18:37:33.905822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerStarted","Data":"2cc4f2211085776c7b89368cd97412ebd02ad80f0b06bca8f077bad89030334c"} Mar 18 18:37:34 crc kubenswrapper[4830]: I0318 18:37:34.931720 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ea4e412-cab8-45af-aec9-188680320831" containerID="af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0" exitCode=0 Mar 18 18:37:34 crc kubenswrapper[4830]: I0318 18:37:34.931821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" 
event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerDied","Data":"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0"} Mar 18 18:37:36 crc kubenswrapper[4830]: I0318 18:37:36.961665 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ea4e412-cab8-45af-aec9-188680320831" containerID="64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf" exitCode=0 Mar 18 18:37:36 crc kubenswrapper[4830]: I0318 18:37:36.961947 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerDied","Data":"64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf"} Mar 18 18:37:37 crc kubenswrapper[4830]: I0318 18:37:37.974106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerStarted","Data":"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58"} Mar 18 18:37:38 crc kubenswrapper[4830]: I0318 18:37:38.007367 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2bgbm" podStartSLOduration=3.43749782 podStartE2EDuration="6.007340207s" podCreationTimestamp="2026-03-18 18:37:32 +0000 UTC" firstStartedPulling="2026-03-18 18:37:34.937571811 +0000 UTC m=+2089.505202183" lastFinishedPulling="2026-03-18 18:37:37.507414208 +0000 UTC m=+2092.075044570" observedRunningTime="2026-03-18 18:37:37.998179386 +0000 UTC m=+2092.565809808" watchObservedRunningTime="2026-03-18 18:37:38.007340207 +0000 UTC m=+2092.574970569" Mar 18 18:37:43 crc kubenswrapper[4830]: I0318 18:37:43.192318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:43 crc kubenswrapper[4830]: I0318 18:37:43.192899 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:43 crc kubenswrapper[4830]: I0318 18:37:43.254460 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:44 crc kubenswrapper[4830]: I0318 18:37:44.088458 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:44 crc kubenswrapper[4830]: I0318 18:37:44.142434 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.045742 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2bgbm" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="registry-server" containerID="cri-o://62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58" gracePeriod=2 Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.567708 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.729978 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content\") pod \"9ea4e412-cab8-45af-aec9-188680320831\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.730058 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276t4\" (UniqueName: \"kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4\") pod \"9ea4e412-cab8-45af-aec9-188680320831\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.730139 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities\") pod \"9ea4e412-cab8-45af-aec9-188680320831\" (UID: \"9ea4e412-cab8-45af-aec9-188680320831\") " Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.731162 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities" (OuterVolumeSpecName: "utilities") pod "9ea4e412-cab8-45af-aec9-188680320831" (UID: "9ea4e412-cab8-45af-aec9-188680320831"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.740132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4" (OuterVolumeSpecName: "kube-api-access-276t4") pod "9ea4e412-cab8-45af-aec9-188680320831" (UID: "9ea4e412-cab8-45af-aec9-188680320831"). InnerVolumeSpecName "kube-api-access-276t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.797659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ea4e412-cab8-45af-aec9-188680320831" (UID: "9ea4e412-cab8-45af-aec9-188680320831"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.832579 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.832635 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276t4\" (UniqueName: \"kubernetes.io/projected/9ea4e412-cab8-45af-aec9-188680320831-kube-api-access-276t4\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:46 crc kubenswrapper[4830]: I0318 18:37:46.832657 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea4e412-cab8-45af-aec9-188680320831-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.060660 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ea4e412-cab8-45af-aec9-188680320831" containerID="62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58" exitCode=0 Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.060804 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2bgbm" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.060813 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerDied","Data":"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58"} Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.062741 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2bgbm" event={"ID":"9ea4e412-cab8-45af-aec9-188680320831","Type":"ContainerDied","Data":"2cc4f2211085776c7b89368cd97412ebd02ad80f0b06bca8f077bad89030334c"} Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.062790 4830 scope.go:117] "RemoveContainer" containerID="62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.111043 4830 scope.go:117] "RemoveContainer" containerID="64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.123336 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.137527 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2bgbm"] Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.141561 4830 scope.go:117] "RemoveContainer" containerID="af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.181748 4830 scope.go:117] "RemoveContainer" containerID="62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58" Mar 18 18:37:47 crc kubenswrapper[4830]: E0318 18:37:47.183443 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58\": container with ID starting with 62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58 not found: ID does not exist" containerID="62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.183501 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58"} err="failed to get container status \"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58\": rpc error: code = NotFound desc = could not find container \"62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58\": container with ID starting with 62b9e0de2122d3beba51b8b1cc094eef00b48f39ef7b525f8e93b85662dc7e58 not found: ID does not exist" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.183533 4830 scope.go:117] "RemoveContainer" containerID="64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf" Mar 18 18:37:47 crc kubenswrapper[4830]: E0318 18:37:47.184141 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf\": container with ID starting with 64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf not found: ID does not exist" containerID="64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.184192 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf"} err="failed to get container status \"64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf\": rpc error: code = NotFound desc = could not find container \"64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf\": container with ID 
starting with 64b84f9bdcf72f300c461ea3236e1188b8c8775e43d2d3fa9f742be5cd8016cf not found: ID does not exist" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.184232 4830 scope.go:117] "RemoveContainer" containerID="af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0" Mar 18 18:37:47 crc kubenswrapper[4830]: E0318 18:37:47.184858 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0\": container with ID starting with af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0 not found: ID does not exist" containerID="af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0" Mar 18 18:37:47 crc kubenswrapper[4830]: I0318 18:37:47.184926 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0"} err="failed to get container status \"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0\": rpc error: code = NotFound desc = could not find container \"af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0\": container with ID starting with af26eec0a44178d2c670a8156683ec59f010d86ff397b0ca3b39986c235468b0 not found: ID does not exist" Mar 18 18:37:48 crc kubenswrapper[4830]: I0318 18:37:48.248469 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea4e412-cab8-45af-aec9-188680320831" path="/var/lib/kubelet/pods/9ea4e412-cab8-45af-aec9-188680320831/volumes" Mar 18 18:37:59 crc kubenswrapper[4830]: I0318 18:37:59.510044 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:37:59 crc kubenswrapper[4830]: I0318 
18:37:59.512019 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:37:59 crc kubenswrapper[4830]: I0318 18:37:59.512106 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:37:59 crc kubenswrapper[4830]: I0318 18:37:59.512988 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:37:59 crc kubenswrapper[4830]: I0318 18:37:59.513094 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf" gracePeriod=600 Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.156003 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564318-vsgrk"] Mar 18 18:38:00 crc kubenswrapper[4830]: E0318 18:38:00.156681 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="extract-utilities" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.156715 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="extract-utilities" Mar 18 18:38:00 crc kubenswrapper[4830]: E0318 
18:38:00.156729 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="extract-content" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.156738 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="extract-content" Mar 18 18:38:00 crc kubenswrapper[4830]: E0318 18:38:00.156753 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="registry-server" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.156762 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="registry-server" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.156981 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea4e412-cab8-45af-aec9-188680320831" containerName="registry-server" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.157518 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.160189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.162610 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.170693 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.172970 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-vsgrk"] Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.174396 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf" exitCode=0 Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.174437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf"} Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.174539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"} Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.174568 4830 scope.go:117] "RemoveContainer" containerID="46d4f627aa313dcb3a4d23bb3daecb5b1b3e4d6558380e5cd88db3746ce3260b" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.270927 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpg4z\" (UniqueName: \"kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z\") pod \"auto-csr-approver-29564318-vsgrk\" (UID: \"73eef308-def6-491f-8c14-45a58d8f066f\") " pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.372400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpg4z\" (UniqueName: \"kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z\") pod \"auto-csr-approver-29564318-vsgrk\" (UID: \"73eef308-def6-491f-8c14-45a58d8f066f\") " pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.389811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpg4z\" (UniqueName: \"kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z\") pod \"auto-csr-approver-29564318-vsgrk\" (UID: \"73eef308-def6-491f-8c14-45a58d8f066f\") " pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.482977 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:00 crc kubenswrapper[4830]: I0318 18:38:00.720249 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-vsgrk"] Mar 18 18:38:00 crc kubenswrapper[4830]: W0318 18:38:00.739254 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73eef308_def6_491f_8c14_45a58d8f066f.slice/crio-5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c WatchSource:0}: Error finding container 5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c: Status 404 returned error can't find the container with id 5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c Mar 18 18:38:01 crc kubenswrapper[4830]: I0318 18:38:01.186504 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" event={"ID":"73eef308-def6-491f-8c14-45a58d8f066f","Type":"ContainerStarted","Data":"5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c"} Mar 18 18:38:03 crc kubenswrapper[4830]: I0318 18:38:03.207680 4830 generic.go:334] "Generic (PLEG): container finished" podID="73eef308-def6-491f-8c14-45a58d8f066f" containerID="d4a98c7d5b941638f1e9f91c645c24621a33ec1aca7b51f67c17b7af6333e126" exitCode=0 Mar 18 18:38:03 crc kubenswrapper[4830]: I0318 18:38:03.207840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" event={"ID":"73eef308-def6-491f-8c14-45a58d8f066f","Type":"ContainerDied","Data":"d4a98c7d5b941638f1e9f91c645c24621a33ec1aca7b51f67c17b7af6333e126"} Mar 18 18:38:04 crc kubenswrapper[4830]: I0318 18:38:04.575535 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:04 crc kubenswrapper[4830]: I0318 18:38:04.733173 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpg4z\" (UniqueName: \"kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z\") pod \"73eef308-def6-491f-8c14-45a58d8f066f\" (UID: \"73eef308-def6-491f-8c14-45a58d8f066f\") " Mar 18 18:38:04 crc kubenswrapper[4830]: I0318 18:38:04.742173 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z" (OuterVolumeSpecName: "kube-api-access-cpg4z") pod "73eef308-def6-491f-8c14-45a58d8f066f" (UID: "73eef308-def6-491f-8c14-45a58d8f066f"). InnerVolumeSpecName "kube-api-access-cpg4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:38:04 crc kubenswrapper[4830]: I0318 18:38:04.835306 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpg4z\" (UniqueName: \"kubernetes.io/projected/73eef308-def6-491f-8c14-45a58d8f066f-kube-api-access-cpg4z\") on node \"crc\" DevicePath \"\"" Mar 18 18:38:05 crc kubenswrapper[4830]: I0318 18:38:05.227183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" event={"ID":"73eef308-def6-491f-8c14-45a58d8f066f","Type":"ContainerDied","Data":"5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c"} Mar 18 18:38:05 crc kubenswrapper[4830]: I0318 18:38:05.227483 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5469d6c459aacfb57863fb0afdf617f5ae69cc435ad1e0fb0bc8bc033a3a3c9c" Mar 18 18:38:05 crc kubenswrapper[4830]: I0318 18:38:05.227251 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-vsgrk" Mar 18 18:38:06 crc kubenswrapper[4830]: I0318 18:38:06.192701 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-tt6hm"] Mar 18 18:38:06 crc kubenswrapper[4830]: I0318 18:38:06.200740 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-tt6hm"] Mar 18 18:38:06 crc kubenswrapper[4830]: I0318 18:38:06.242531 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685eb33e-eea1-47ad-bbf4-567dc5821cac" path="/var/lib/kubelet/pods/685eb33e-eea1-47ad-bbf4-567dc5821cac/volumes" Mar 18 18:38:16 crc kubenswrapper[4830]: I0318 18:38:16.619028 4830 scope.go:117] "RemoveContainer" containerID="a00bd79a7aa0d7b7bfea0d24aeab33dadc514aba38d34746691e4adcd95d9fe8" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.725560 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:28 crc kubenswrapper[4830]: E0318 18:38:28.728286 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73eef308-def6-491f-8c14-45a58d8f066f" containerName="oc" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.728444 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="73eef308-def6-491f-8c14-45a58d8f066f" containerName="oc" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.728858 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="73eef308-def6-491f-8c14-45a58d8f066f" containerName="oc" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.730630 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.745935 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.914881 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.914926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:28 crc kubenswrapper[4830]: I0318 18:38:28.914983 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvfvg\" (UniqueName: \"kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.016523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvfvg\" (UniqueName: \"kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.016623 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.016645 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.017073 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.017580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.038935 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvfvg\" (UniqueName: \"kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg\") pod \"community-operators-ssvhw\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.059741 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:29 crc kubenswrapper[4830]: I0318 18:38:29.536931 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:30 crc kubenswrapper[4830]: I0318 18:38:30.447341 4830 generic.go:334] "Generic (PLEG): container finished" podID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerID="eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9" exitCode=0 Mar 18 18:38:30 crc kubenswrapper[4830]: I0318 18:38:30.447421 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerDied","Data":"eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9"} Mar 18 18:38:30 crc kubenswrapper[4830]: I0318 18:38:30.447748 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerStarted","Data":"abdb85b15a5c16e21d6a83b4bf70db36d070cb6d63e3f9cd2a544c0c50688097"} Mar 18 18:38:33 crc kubenswrapper[4830]: I0318 18:38:33.910882 4830 generic.go:334] "Generic (PLEG): container finished" podID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerID="bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454" exitCode=0 Mar 18 18:38:33 crc kubenswrapper[4830]: I0318 18:38:33.912034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerDied","Data":"bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454"} Mar 18 18:38:35 crc kubenswrapper[4830]: I0318 18:38:35.941436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" 
event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerStarted","Data":"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6"} Mar 18 18:38:35 crc kubenswrapper[4830]: I0318 18:38:35.969289 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ssvhw" podStartSLOduration=2.925908418 podStartE2EDuration="7.96925924s" podCreationTimestamp="2026-03-18 18:38:28 +0000 UTC" firstStartedPulling="2026-03-18 18:38:30.450061317 +0000 UTC m=+2145.017691669" lastFinishedPulling="2026-03-18 18:38:35.493412119 +0000 UTC m=+2150.061042491" observedRunningTime="2026-03-18 18:38:35.968120558 +0000 UTC m=+2150.535750900" watchObservedRunningTime="2026-03-18 18:38:35.96925924 +0000 UTC m=+2150.536889612" Mar 18 18:38:39 crc kubenswrapper[4830]: I0318 18:38:39.060839 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:39 crc kubenswrapper[4830]: I0318 18:38:39.061162 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:39 crc kubenswrapper[4830]: I0318 18:38:39.134349 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:40 crc kubenswrapper[4830]: I0318 18:38:40.339873 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:43 crc kubenswrapper[4830]: I0318 18:38:43.549255 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:43 crc kubenswrapper[4830]: I0318 18:38:43.550127 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ssvhw" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="registry-server" 
containerID="cri-o://608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6" gracePeriod=2 Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.046235 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.057159 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities\") pod \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.057322 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content\") pod \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.059002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvfvg\" (UniqueName: \"kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg\") pod \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\" (UID: \"53eddf8f-529f-46e5-a245-44e4fb1d2abb\") " Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.059963 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities" (OuterVolumeSpecName: "utilities") pod "53eddf8f-529f-46e5-a245-44e4fb1d2abb" (UID: "53eddf8f-529f-46e5-a245-44e4fb1d2abb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.078213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg" (OuterVolumeSpecName: "kube-api-access-lvfvg") pod "53eddf8f-529f-46e5-a245-44e4fb1d2abb" (UID: "53eddf8f-529f-46e5-a245-44e4fb1d2abb"). InnerVolumeSpecName "kube-api-access-lvfvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.160544 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.160596 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvfvg\" (UniqueName: \"kubernetes.io/projected/53eddf8f-529f-46e5-a245-44e4fb1d2abb-kube-api-access-lvfvg\") on node \"crc\" DevicePath \"\"" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.341382 4830 generic.go:334] "Generic (PLEG): container finished" podID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerID="608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6" exitCode=0 Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.341472 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerDied","Data":"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6"} Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.341532 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssvhw" event={"ID":"53eddf8f-529f-46e5-a245-44e4fb1d2abb","Type":"ContainerDied","Data":"abdb85b15a5c16e21d6a83b4bf70db36d070cb6d63e3f9cd2a544c0c50688097"} Mar 18 18:38:44 crc kubenswrapper[4830]: 
I0318 18:38:44.341556 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssvhw" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.341566 4830 scope.go:117] "RemoveContainer" containerID="608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.363249 4830 scope.go:117] "RemoveContainer" containerID="bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.385215 4830 scope.go:117] "RemoveContainer" containerID="eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.418128 4830 scope.go:117] "RemoveContainer" containerID="608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6" Mar 18 18:38:44 crc kubenswrapper[4830]: E0318 18:38:44.418720 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6\": container with ID starting with 608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6 not found: ID does not exist" containerID="608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.418795 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6"} err="failed to get container status \"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6\": rpc error: code = NotFound desc = could not find container \"608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6\": container with ID starting with 608bba8a89f39de461eb011c45d500eb08c283acebe0cddaf64e1b7282914ae6 not found: ID does not exist" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.418822 4830 
scope.go:117] "RemoveContainer" containerID="bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454" Mar 18 18:38:44 crc kubenswrapper[4830]: E0318 18:38:44.419534 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454\": container with ID starting with bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454 not found: ID does not exist" containerID="bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.419571 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454"} err="failed to get container status \"bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454\": rpc error: code = NotFound desc = could not find container \"bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454\": container with ID starting with bfc756ace1316933fa6da48d83f15b536b311acefc5f237f6bfce576b305c454 not found: ID does not exist" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.419599 4830 scope.go:117] "RemoveContainer" containerID="eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9" Mar 18 18:38:44 crc kubenswrapper[4830]: E0318 18:38:44.420128 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9\": container with ID starting with eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9 not found: ID does not exist" containerID="eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.420156 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9"} err="failed to get container status \"eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9\": rpc error: code = NotFound desc = could not find container \"eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9\": container with ID starting with eef1295f76ea1291db10753a47c532d7515b77046cfe9f094cf8947d2a0969e9 not found: ID does not exist" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.667354 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53eddf8f-529f-46e5-a245-44e4fb1d2abb" (UID: "53eddf8f-529f-46e5-a245-44e4fb1d2abb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.768730 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eddf8f-529f-46e5-a245-44e4fb1d2abb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:38:44 crc kubenswrapper[4830]: I0318 18:38:44.992029 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:45 crc kubenswrapper[4830]: I0318 18:38:45.002416 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ssvhw"] Mar 18 18:38:46 crc kubenswrapper[4830]: I0318 18:38:46.254990 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" path="/var/lib/kubelet/pods/53eddf8f-529f-46e5-a245-44e4fb1d2abb/volumes" Mar 18 18:39:59 crc kubenswrapper[4830]: I0318 18:39:59.509991 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:39:59 crc kubenswrapper[4830]: I0318 18:39:59.510760 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.155696 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564320-fdgsj"] Mar 18 18:40:00 crc kubenswrapper[4830]: E0318 18:40:00.156215 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="extract-content" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.156250 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="extract-content" Mar 18 18:40:00 crc kubenswrapper[4830]: E0318 18:40:00.156289 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="registry-server" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.156302 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="registry-server" Mar 18 18:40:00 crc kubenswrapper[4830]: E0318 18:40:00.156335 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="extract-utilities" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.156350 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="extract-utilities" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.156579 4830 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="53eddf8f-529f-46e5-a245-44e4fb1d2abb" containerName="registry-server" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.157400 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.161213 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.162819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.164139 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-fdgsj"] Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.165315 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.249187 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmqg\" (UniqueName: \"kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg\") pod \"auto-csr-approver-29564320-fdgsj\" (UID: \"9de10e5e-26e9-4d40-847e-ef92da467d45\") " pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.351029 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmqg\" (UniqueName: \"kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg\") pod \"auto-csr-approver-29564320-fdgsj\" (UID: \"9de10e5e-26e9-4d40-847e-ef92da467d45\") " pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.375670 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hbmqg\" (UniqueName: \"kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg\") pod \"auto-csr-approver-29564320-fdgsj\" (UID: \"9de10e5e-26e9-4d40-847e-ef92da467d45\") " pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.485532 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:00 crc kubenswrapper[4830]: I0318 18:40:00.745337 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-fdgsj"] Mar 18 18:40:01 crc kubenswrapper[4830]: I0318 18:40:01.360085 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" event={"ID":"9de10e5e-26e9-4d40-847e-ef92da467d45","Type":"ContainerStarted","Data":"562b50b2f80a25f2ad29a2bf7e73f9928d10cd363e8622ce4367f1fbbfadb233"} Mar 18 18:40:02 crc kubenswrapper[4830]: I0318 18:40:02.371268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" event={"ID":"9de10e5e-26e9-4d40-847e-ef92da467d45","Type":"ContainerStarted","Data":"18bf01fadb0260ed507c33f4c2e9f2d227ee4e7ea0ef5009bcb576c2798ec120"} Mar 18 18:40:02 crc kubenswrapper[4830]: I0318 18:40:02.405101 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" podStartSLOduration=1.22570526 podStartE2EDuration="2.405066785s" podCreationTimestamp="2026-03-18 18:40:00 +0000 UTC" firstStartedPulling="2026-03-18 18:40:00.749941051 +0000 UTC m=+2235.317571383" lastFinishedPulling="2026-03-18 18:40:01.929302536 +0000 UTC m=+2236.496932908" observedRunningTime="2026-03-18 18:40:02.400147586 +0000 UTC m=+2236.967777928" watchObservedRunningTime="2026-03-18 18:40:02.405066785 +0000 UTC m=+2236.972697147" Mar 18 18:40:03 crc kubenswrapper[4830]: I0318 18:40:03.382574 4830 
generic.go:334] "Generic (PLEG): container finished" podID="9de10e5e-26e9-4d40-847e-ef92da467d45" containerID="18bf01fadb0260ed507c33f4c2e9f2d227ee4e7ea0ef5009bcb576c2798ec120" exitCode=0 Mar 18 18:40:03 crc kubenswrapper[4830]: I0318 18:40:03.382636 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" event={"ID":"9de10e5e-26e9-4d40-847e-ef92da467d45","Type":"ContainerDied","Data":"18bf01fadb0260ed507c33f4c2e9f2d227ee4e7ea0ef5009bcb576c2798ec120"} Mar 18 18:40:04 crc kubenswrapper[4830]: I0318 18:40:04.738701 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:04 crc kubenswrapper[4830]: I0318 18:40:04.831950 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmqg\" (UniqueName: \"kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg\") pod \"9de10e5e-26e9-4d40-847e-ef92da467d45\" (UID: \"9de10e5e-26e9-4d40-847e-ef92da467d45\") " Mar 18 18:40:04 crc kubenswrapper[4830]: I0318 18:40:04.842349 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg" (OuterVolumeSpecName: "kube-api-access-hbmqg") pod "9de10e5e-26e9-4d40-847e-ef92da467d45" (UID: "9de10e5e-26e9-4d40-847e-ef92da467d45"). InnerVolumeSpecName "kube-api-access-hbmqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:40:04 crc kubenswrapper[4830]: I0318 18:40:04.933725 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmqg\" (UniqueName: \"kubernetes.io/projected/9de10e5e-26e9-4d40-847e-ef92da467d45-kube-api-access-hbmqg\") on node \"crc\" DevicePath \"\"" Mar 18 18:40:05 crc kubenswrapper[4830]: I0318 18:40:05.412753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" event={"ID":"9de10e5e-26e9-4d40-847e-ef92da467d45","Type":"ContainerDied","Data":"562b50b2f80a25f2ad29a2bf7e73f9928d10cd363e8622ce4367f1fbbfadb233"} Mar 18 18:40:05 crc kubenswrapper[4830]: I0318 18:40:05.413299 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="562b50b2f80a25f2ad29a2bf7e73f9928d10cd363e8622ce4367f1fbbfadb233" Mar 18 18:40:05 crc kubenswrapper[4830]: I0318 18:40:05.413089 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-fdgsj" Mar 18 18:40:05 crc kubenswrapper[4830]: I0318 18:40:05.491707 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-lxbdr"] Mar 18 18:40:05 crc kubenswrapper[4830]: I0318 18:40:05.500865 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-lxbdr"] Mar 18 18:40:06 crc kubenswrapper[4830]: I0318 18:40:06.258905 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddccc616-3d55-449e-a9fe-b0769f1b1034" path="/var/lib/kubelet/pods/ddccc616-3d55-449e-a9fe-b0769f1b1034/volumes" Mar 18 18:40:16 crc kubenswrapper[4830]: I0318 18:40:16.764040 4830 scope.go:117] "RemoveContainer" containerID="d3d1f2f024b45ab977ec859af2ce98d94c4a018f026966edfde536e41cb83407" Mar 18 18:40:29 crc kubenswrapper[4830]: I0318 18:40:29.509702 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:40:29 crc kubenswrapper[4830]: I0318 18:40:29.510909 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:40:59 crc kubenswrapper[4830]: I0318 18:40:59.509706 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:40:59 crc kubenswrapper[4830]: I0318 18:40:59.510632 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:40:59 crc kubenswrapper[4830]: I0318 18:40:59.510720 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:40:59 crc kubenswrapper[4830]: I0318 18:40:59.511765 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 18 18:40:59 crc kubenswrapper[4830]: I0318 18:40:59.511918 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" gracePeriod=600 Mar 18 18:40:59 crc kubenswrapper[4830]: E0318 18:40:59.648311 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:41:00 crc kubenswrapper[4830]: I0318 18:41:00.065846 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" exitCode=0 Mar 18 18:41:00 crc kubenswrapper[4830]: I0318 18:41:00.065891 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"} Mar 18 18:41:00 crc kubenswrapper[4830]: I0318 18:41:00.065923 4830 scope.go:117] "RemoveContainer" containerID="45d96ab476768f3520cb380f87fe0545043e6591eaa2f2e1a853cbb7f4d2d3bf" Mar 18 18:41:00 crc kubenswrapper[4830]: I0318 18:41:00.066622 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:41:00 crc kubenswrapper[4830]: E0318 18:41:00.067051 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:41:11 crc kubenswrapper[4830]: I0318 18:41:11.234817 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:41:11 crc kubenswrapper[4830]: E0318 18:41:11.235756 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:41:23 crc kubenswrapper[4830]: I0318 18:41:23.234678 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:41:23 crc kubenswrapper[4830]: E0318 18:41:23.235727 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:41:38 crc kubenswrapper[4830]: I0318 18:41:38.235286 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:41:38 crc kubenswrapper[4830]: E0318 18:41:38.236405 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:41:52 crc kubenswrapper[4830]: I0318 18:41:52.234828 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:41:52 crc kubenswrapper[4830]: E0318 18:41:52.236216 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.160558 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564322-jh2dp"] Mar 18 18:42:00 crc kubenswrapper[4830]: E0318 18:42:00.161709 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de10e5e-26e9-4d40-847e-ef92da467d45" containerName="oc" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.161733 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de10e5e-26e9-4d40-847e-ef92da467d45" containerName="oc" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.162015 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de10e5e-26e9-4d40-847e-ef92da467d45" containerName="oc" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.162708 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.166194 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.166553 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.169466 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.171850 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-jh2dp"] Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.269049 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphln\" (UniqueName: \"kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln\") pod \"auto-csr-approver-29564322-jh2dp\" (UID: \"daf3284a-e858-4458-b1d5-a0d3f19226d1\") " pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.371239 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jphln\" (UniqueName: \"kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln\") pod \"auto-csr-approver-29564322-jh2dp\" (UID: \"daf3284a-e858-4458-b1d5-a0d3f19226d1\") " pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.409314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphln\" (UniqueName: \"kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln\") pod \"auto-csr-approver-29564322-jh2dp\" (UID: \"daf3284a-e858-4458-b1d5-a0d3f19226d1\") " 
pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.514273 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:00 crc kubenswrapper[4830]: I0318 18:42:00.818538 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-jh2dp"] Mar 18 18:42:01 crc kubenswrapper[4830]: I0318 18:42:01.670557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" event={"ID":"daf3284a-e858-4458-b1d5-a0d3f19226d1","Type":"ContainerStarted","Data":"ab6fe6f2d4d3c6fe55e42230dfc506939cf2f7a7d3b00f92654b674cf6032208"} Mar 18 18:42:03 crc kubenswrapper[4830]: I0318 18:42:03.700565 4830 generic.go:334] "Generic (PLEG): container finished" podID="daf3284a-e858-4458-b1d5-a0d3f19226d1" containerID="87d22b35ee34ec1ab9a227e039de5f91f3a7820d8b0c64c9139efe4f5ac709b4" exitCode=0 Mar 18 18:42:03 crc kubenswrapper[4830]: I0318 18:42:03.700715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" event={"ID":"daf3284a-e858-4458-b1d5-a0d3f19226d1","Type":"ContainerDied","Data":"87d22b35ee34ec1ab9a227e039de5f91f3a7820d8b0c64c9139efe4f5ac709b4"} Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.130965 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.244922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jphln\" (UniqueName: \"kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln\") pod \"daf3284a-e858-4458-b1d5-a0d3f19226d1\" (UID: \"daf3284a-e858-4458-b1d5-a0d3f19226d1\") " Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.253145 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln" (OuterVolumeSpecName: "kube-api-access-jphln") pod "daf3284a-e858-4458-b1d5-a0d3f19226d1" (UID: "daf3284a-e858-4458-b1d5-a0d3f19226d1"). InnerVolumeSpecName "kube-api-access-jphln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.347393 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jphln\" (UniqueName: \"kubernetes.io/projected/daf3284a-e858-4458-b1d5-a0d3f19226d1-kube-api-access-jphln\") on node \"crc\" DevicePath \"\"" Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.724264 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" event={"ID":"daf3284a-e858-4458-b1d5-a0d3f19226d1","Type":"ContainerDied","Data":"ab6fe6f2d4d3c6fe55e42230dfc506939cf2f7a7d3b00f92654b674cf6032208"} Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.724335 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab6fe6f2d4d3c6fe55e42230dfc506939cf2f7a7d3b00f92654b674cf6032208" Mar 18 18:42:05 crc kubenswrapper[4830]: I0318 18:42:05.724475 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-jh2dp" Mar 18 18:42:06 crc kubenswrapper[4830]: I0318 18:42:06.224243 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-q5l4v"] Mar 18 18:42:06 crc kubenswrapper[4830]: I0318 18:42:06.234405 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-q5l4v"] Mar 18 18:42:06 crc kubenswrapper[4830]: I0318 18:42:06.272916 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca6178c-b87d-48ce-9e4d-bed2d9d62174" path="/var/lib/kubelet/pods/8ca6178c-b87d-48ce-9e4d-bed2d9d62174/volumes" Mar 18 18:42:07 crc kubenswrapper[4830]: I0318 18:42:07.234674 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:42:07 crc kubenswrapper[4830]: E0318 18:42:07.234943 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:42:16 crc kubenswrapper[4830]: I0318 18:42:16.885927 4830 scope.go:117] "RemoveContainer" containerID="669ee0a9ffa94ead7911f10d351ad6cf26d95073b1bbfaea68e97275434fe9d2" Mar 18 18:42:19 crc kubenswrapper[4830]: I0318 18:42:19.235205 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:42:19 crc kubenswrapper[4830]: E0318 18:42:19.235981 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:42:34 crc kubenswrapper[4830]: I0318 18:42:34.236027 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:42:34 crc kubenswrapper[4830]: E0318 18:42:34.237092 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:42:49 crc kubenswrapper[4830]: I0318 18:42:49.235304 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:42:49 crc kubenswrapper[4830]: E0318 18:42:49.238717 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:43:02 crc kubenswrapper[4830]: I0318 18:43:02.238891 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:43:02 crc kubenswrapper[4830]: E0318 18:43:02.245850 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:43:13 crc kubenswrapper[4830]: I0318 18:43:13.235715 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:43:13 crc kubenswrapper[4830]: E0318 18:43:13.236707 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:43:28 crc kubenswrapper[4830]: I0318 18:43:28.235320 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:43:28 crc kubenswrapper[4830]: E0318 18:43:28.236418 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:43:43 crc kubenswrapper[4830]: I0318 18:43:43.235374 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:43:43 crc kubenswrapper[4830]: E0318 18:43:43.236362 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:43:58 crc kubenswrapper[4830]: I0318 18:43:58.235407 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:43:58 crc kubenswrapper[4830]: E0318 18:43:58.237586 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.172690 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564324-6pxbs"] Mar 18 18:44:00 crc kubenswrapper[4830]: E0318 18:44:00.173116 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf3284a-e858-4458-b1d5-a0d3f19226d1" containerName="oc" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.173137 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf3284a-e858-4458-b1d5-a0d3f19226d1" containerName="oc" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.173421 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf3284a-e858-4458-b1d5-a0d3f19226d1" containerName="oc" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.174166 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-6pxbs" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.176224 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.176508 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.177513 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.194298 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-6pxbs"] Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.197184 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9rg\" (UniqueName: \"kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg\") pod \"auto-csr-approver-29564324-6pxbs\" (UID: \"6cc0d3a3-b53d-4742-b41b-306b83656408\") " pod="openshift-infra/auto-csr-approver-29564324-6pxbs" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.298924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9rg\" (UniqueName: \"kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg\") pod \"auto-csr-approver-29564324-6pxbs\" (UID: \"6cc0d3a3-b53d-4742-b41b-306b83656408\") " pod="openshift-infra/auto-csr-approver-29564324-6pxbs" Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.330395 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9rg\" (UniqueName: \"kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg\") pod \"auto-csr-approver-29564324-6pxbs\" (UID: \"6cc0d3a3-b53d-4742-b41b-306b83656408\") " 
pod="openshift-infra/auto-csr-approver-29564324-6pxbs"
Mar 18 18:44:00 crc kubenswrapper[4830]: I0318 18:44:00.497414 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-6pxbs"
Mar 18 18:44:01 crc kubenswrapper[4830]: I0318 18:44:01.013553 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-6pxbs"]
Mar 18 18:44:01 crc kubenswrapper[4830]: I0318 18:44:01.028946 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 18:44:01 crc kubenswrapper[4830]: I0318 18:44:01.888057 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-6pxbs" event={"ID":"6cc0d3a3-b53d-4742-b41b-306b83656408","Type":"ContainerStarted","Data":"a8bbc59b9f3a8902f4c03a21ee07d8c0f88751abcff1f3507ddcd662ac3bbe5e"}
Mar 18 18:44:02 crc kubenswrapper[4830]: I0318 18:44:02.900189 4830 generic.go:334] "Generic (PLEG): container finished" podID="6cc0d3a3-b53d-4742-b41b-306b83656408" containerID="78e9cf6ef81244f467059302f6f09545389de7fbe97b41bd2d82422f1439819c" exitCode=0
Mar 18 18:44:02 crc kubenswrapper[4830]: I0318 18:44:02.900407 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-6pxbs" event={"ID":"6cc0d3a3-b53d-4742-b41b-306b83656408","Type":"ContainerDied","Data":"78e9cf6ef81244f467059302f6f09545389de7fbe97b41bd2d82422f1439819c"}
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.262850 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-6pxbs"
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.364261 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n9rg\" (UniqueName: \"kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg\") pod \"6cc0d3a3-b53d-4742-b41b-306b83656408\" (UID: \"6cc0d3a3-b53d-4742-b41b-306b83656408\") "
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.374765 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg" (OuterVolumeSpecName: "kube-api-access-8n9rg") pod "6cc0d3a3-b53d-4742-b41b-306b83656408" (UID: "6cc0d3a3-b53d-4742-b41b-306b83656408"). InnerVolumeSpecName "kube-api-access-8n9rg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.476192 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n9rg\" (UniqueName: \"kubernetes.io/projected/6cc0d3a3-b53d-4742-b41b-306b83656408-kube-api-access-8n9rg\") on node \"crc\" DevicePath \"\""
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.917251 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-6pxbs"
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.917203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-6pxbs" event={"ID":"6cc0d3a3-b53d-4742-b41b-306b83656408","Type":"ContainerDied","Data":"a8bbc59b9f3a8902f4c03a21ee07d8c0f88751abcff1f3507ddcd662ac3bbe5e"}
Mar 18 18:44:04 crc kubenswrapper[4830]: I0318 18:44:04.917595 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8bbc59b9f3a8902f4c03a21ee07d8c0f88751abcff1f3507ddcd662ac3bbe5e"
Mar 18 18:44:05 crc kubenswrapper[4830]: I0318 18:44:05.326586 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-vsgrk"]
Mar 18 18:44:05 crc kubenswrapper[4830]: I0318 18:44:05.332464 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-vsgrk"]
Mar 18 18:44:06 crc kubenswrapper[4830]: I0318 18:44:06.527891 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73eef308-def6-491f-8c14-45a58d8f066f" path="/var/lib/kubelet/pods/73eef308-def6-491f-8c14-45a58d8f066f/volumes"
Mar 18 18:44:13 crc kubenswrapper[4830]: I0318 18:44:13.234997 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:44:13 crc kubenswrapper[4830]: E0318 18:44:13.236278 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:44:16 crc kubenswrapper[4830]: I0318 18:44:16.980498 4830 scope.go:117] "RemoveContainer" containerID="d4a98c7d5b941638f1e9f91c645c24621a33ec1aca7b51f67c17b7af6333e126"
Mar 18 18:44:28 crc kubenswrapper[4830]: I0318 18:44:28.234902 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:44:28 crc kubenswrapper[4830]: E0318 18:44:28.235955 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:44:42 crc kubenswrapper[4830]: I0318 18:44:42.235679 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:44:42 crc kubenswrapper[4830]: E0318 18:44:42.236884 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:44:57 crc kubenswrapper[4830]: I0318 18:44:57.235061 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:44:57 crc kubenswrapper[4830]: E0318 18:44:57.235982 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.148243 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"]
Mar 18 18:45:00 crc kubenswrapper[4830]: E0318 18:45:00.148801 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc0d3a3-b53d-4742-b41b-306b83656408" containerName="oc"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.148813 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc0d3a3-b53d-4742-b41b-306b83656408" containerName="oc"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.148985 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc0d3a3-b53d-4742-b41b-306b83656408" containerName="oc"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.149411 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.154733 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"]
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.158261 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.158535 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.248206 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.248298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.248555 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnv2\" (UniqueName: \"kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.350013 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.350146 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnv2\" (UniqueName: \"kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.350190 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.351236 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.356304 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.369396 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnv2\" (UniqueName: \"kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2\") pod \"collect-profiles-29564325-zzz7z\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.486625 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:00 crc kubenswrapper[4830]: I0318 18:45:00.945292 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"]
Mar 18 18:45:01 crc kubenswrapper[4830]: I0318 18:45:01.646948 4830 generic.go:334] "Generic (PLEG): container finished" podID="1957d43b-bc6c-4df6-92fa-b934f78770ea" containerID="3335daa75f545881a57e4c88cb6ce0b05e351fc8483403bbb1dd1e8800cec6d2" exitCode=0
Mar 18 18:45:01 crc kubenswrapper[4830]: I0318 18:45:01.647006 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z" event={"ID":"1957d43b-bc6c-4df6-92fa-b934f78770ea","Type":"ContainerDied","Data":"3335daa75f545881a57e4c88cb6ce0b05e351fc8483403bbb1dd1e8800cec6d2"}
Mar 18 18:45:01 crc kubenswrapper[4830]: I0318 18:45:01.647043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z" event={"ID":"1957d43b-bc6c-4df6-92fa-b934f78770ea","Type":"ContainerStarted","Data":"bc58bf0c741756d64d6ee7a34e20e56c5cb7d66d1ecf20e4133f61715f93c2cc"}
Mar 18 18:45:02 crc kubenswrapper[4830]: I0318 18:45:02.983519 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.089931 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume\") pod \"1957d43b-bc6c-4df6-92fa-b934f78770ea\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") "
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.089990 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcnv2\" (UniqueName: \"kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2\") pod \"1957d43b-bc6c-4df6-92fa-b934f78770ea\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") "
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.090116 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume\") pod \"1957d43b-bc6c-4df6-92fa-b934f78770ea\" (UID: \"1957d43b-bc6c-4df6-92fa-b934f78770ea\") "
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.090934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "1957d43b-bc6c-4df6-92fa-b934f78770ea" (UID: "1957d43b-bc6c-4df6-92fa-b934f78770ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.096380 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2" (OuterVolumeSpecName: "kube-api-access-pcnv2") pod "1957d43b-bc6c-4df6-92fa-b934f78770ea" (UID: "1957d43b-bc6c-4df6-92fa-b934f78770ea"). InnerVolumeSpecName "kube-api-access-pcnv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.097349 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1957d43b-bc6c-4df6-92fa-b934f78770ea" (UID: "1957d43b-bc6c-4df6-92fa-b934f78770ea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.191956 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1957d43b-bc6c-4df6-92fa-b934f78770ea-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.192333 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcnv2\" (UniqueName: \"kubernetes.io/projected/1957d43b-bc6c-4df6-92fa-b934f78770ea-kube-api-access-pcnv2\") on node \"crc\" DevicePath \"\""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.192462 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1957d43b-bc6c-4df6-92fa-b934f78770ea-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.666851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z" event={"ID":"1957d43b-bc6c-4df6-92fa-b934f78770ea","Type":"ContainerDied","Data":"bc58bf0c741756d64d6ee7a34e20e56c5cb7d66d1ecf20e4133f61715f93c2cc"}
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.667318 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc58bf0c741756d64d6ee7a34e20e56c5cb7d66d1ecf20e4133f61715f93c2cc"
Mar 18 18:45:03 crc kubenswrapper[4830]: I0318 18:45:03.667430 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"
Mar 18 18:45:04 crc kubenswrapper[4830]: I0318 18:45:04.077617 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"]
Mar 18 18:45:04 crc kubenswrapper[4830]: I0318 18:45:04.082186 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-6hx2v"]
Mar 18 18:45:04 crc kubenswrapper[4830]: I0318 18:45:04.244521 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19aab548-96a0-4056-8226-f9e7cf4b3ca3" path="/var/lib/kubelet/pods/19aab548-96a0-4056-8226-f9e7cf4b3ca3/volumes"
Mar 18 18:45:11 crc kubenswrapper[4830]: I0318 18:45:11.235132 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:45:11 crc kubenswrapper[4830]: E0318 18:45:11.236312 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:45:17 crc kubenswrapper[4830]: I0318 18:45:17.059295 4830 scope.go:117] "RemoveContainer" containerID="de7a7195b9b43f06d4c613b87f6701cc48012d5464974eeeccde1f5a1e890958"
Mar 18 18:45:22 crc kubenswrapper[4830]: I0318 18:45:22.234838 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:45:22 crc kubenswrapper[4830]: E0318 18:45:22.235834 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:45:35 crc kubenswrapper[4830]: I0318 18:45:35.235225 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:45:35 crc kubenswrapper[4830]: E0318 18:45:35.236409 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:45:46 crc kubenswrapper[4830]: I0318 18:45:46.242657 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:45:46 crc kubenswrapper[4830]: E0318 18:45:46.243546 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:45:57 crc kubenswrapper[4830]: I0318 18:45:57.234843 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:45:57 crc kubenswrapper[4830]: E0318 18:45:57.236245 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.158518 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564326-5z5ds"]
Mar 18 18:46:00 crc kubenswrapper[4830]: E0318 18:46:00.159415 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1957d43b-bc6c-4df6-92fa-b934f78770ea" containerName="collect-profiles"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.159436 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1957d43b-bc6c-4df6-92fa-b934f78770ea" containerName="collect-profiles"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.159742 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1957d43b-bc6c-4df6-92fa-b934f78770ea" containerName="collect-profiles"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.160676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.163269 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.163625 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.164013 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.171294 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-5z5ds"]
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.238173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmcd\" (UniqueName: \"kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd\") pod \"auto-csr-approver-29564326-5z5ds\" (UID: \"a180a8ce-62ba-4d86-847f-dd0db0f293b9\") " pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.341109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmcd\" (UniqueName: \"kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd\") pod \"auto-csr-approver-29564326-5z5ds\" (UID: \"a180a8ce-62ba-4d86-847f-dd0db0f293b9\") " pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.363689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmcd\" (UniqueName: \"kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd\") pod \"auto-csr-approver-29564326-5z5ds\" (UID: \"a180a8ce-62ba-4d86-847f-dd0db0f293b9\") " pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.497519 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:00 crc kubenswrapper[4830]: W0318 18:46:00.754543 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda180a8ce_62ba_4d86_847f_dd0db0f293b9.slice/crio-a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655 WatchSource:0}: Error finding container a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655: Status 404 returned error can't find the container with id a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655
Mar 18 18:46:00 crc kubenswrapper[4830]: I0318 18:46:00.757136 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-5z5ds"]
Mar 18 18:46:01 crc kubenswrapper[4830]: I0318 18:46:01.402713 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-5z5ds" event={"ID":"a180a8ce-62ba-4d86-847f-dd0db0f293b9","Type":"ContainerStarted","Data":"a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655"}
Mar 18 18:46:03 crc kubenswrapper[4830]: I0318 18:46:03.421057 4830 generic.go:334] "Generic (PLEG): container finished" podID="a180a8ce-62ba-4d86-847f-dd0db0f293b9" containerID="5a596de1cc1ef25b9c2f1bf13a756c76d693878a7468ef0d8169a8b61643c8d3" exitCode=0
Mar 18 18:46:03 crc kubenswrapper[4830]: I0318 18:46:03.421171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-5z5ds" event={"ID":"a180a8ce-62ba-4d86-847f-dd0db0f293b9","Type":"ContainerDied","Data":"5a596de1cc1ef25b9c2f1bf13a756c76d693878a7468ef0d8169a8b61643c8d3"}
Mar 18 18:46:04 crc kubenswrapper[4830]: I0318 18:46:04.781304 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:04 crc kubenswrapper[4830]: I0318 18:46:04.984191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmcd\" (UniqueName: \"kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd\") pod \"a180a8ce-62ba-4d86-847f-dd0db0f293b9\" (UID: \"a180a8ce-62ba-4d86-847f-dd0db0f293b9\") "
Mar 18 18:46:04 crc kubenswrapper[4830]: I0318 18:46:04.994965 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd" (OuterVolumeSpecName: "kube-api-access-glmcd") pod "a180a8ce-62ba-4d86-847f-dd0db0f293b9" (UID: "a180a8ce-62ba-4d86-847f-dd0db0f293b9"). InnerVolumeSpecName "kube-api-access-glmcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.086315 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmcd\" (UniqueName: \"kubernetes.io/projected/a180a8ce-62ba-4d86-847f-dd0db0f293b9-kube-api-access-glmcd\") on node \"crc\" DevicePath \"\""
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.448166 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-5z5ds" event={"ID":"a180a8ce-62ba-4d86-847f-dd0db0f293b9","Type":"ContainerDied","Data":"a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655"}
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.448233 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a25a0b3be11f64992121b1663877b1283c4d69710c77e6684dacc360a9dd9655"
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.448325 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-5z5ds"
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.873526 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-fdgsj"]
Mar 18 18:46:05 crc kubenswrapper[4830]: I0318 18:46:05.883549 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-fdgsj"]
Mar 18 18:46:06 crc kubenswrapper[4830]: I0318 18:46:06.257568 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de10e5e-26e9-4d40-847e-ef92da467d45" path="/var/lib/kubelet/pods/9de10e5e-26e9-4d40-847e-ef92da467d45/volumes"
Mar 18 18:46:11 crc kubenswrapper[4830]: I0318 18:46:11.235144 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04"
Mar 18 18:46:12 crc kubenswrapper[4830]: I0318 18:46:12.513425 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4"}
Mar 18 18:46:17 crc kubenswrapper[4830]: I0318 18:46:17.137081 4830 scope.go:117] "RemoveContainer" containerID="18bf01fadb0260ed507c33f4c2e9f2d227ee4e7ea0ef5009bcb576c2798ec120"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.803917 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"]
Mar 18 18:47:45 crc kubenswrapper[4830]: E0318 18:47:45.807264 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a180a8ce-62ba-4d86-847f-dd0db0f293b9" containerName="oc"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.807289 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a180a8ce-62ba-4d86-847f-dd0db0f293b9" containerName="oc"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.807534 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a180a8ce-62ba-4d86-847f-dd0db0f293b9" containerName="oc"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.809134 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.823829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"]
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.956498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.956748 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc6g\" (UniqueName: \"kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:45 crc kubenswrapper[4830]: I0318 18:47:45.956898 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.057863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.057926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.057993 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc6g\" (UniqueName: \"kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.058427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.058554 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.090007 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc6g\" (UniqueName: \"kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g\") pod \"certified-operators-r7vcs\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.136216 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:46 crc kubenswrapper[4830]: I0318 18:47:46.648070 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"]
Mar 18 18:47:47 crc kubenswrapper[4830]: I0318 18:47:47.458647 4830 generic.go:334] "Generic (PLEG): container finished" podID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerID="35ba0ac2681d8bae85c8f0b641f7d5c041227b9b99b28cecbd9d96adea3eaee2" exitCode=0
Mar 18 18:47:47 crc kubenswrapper[4830]: I0318 18:47:47.458735 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerDied","Data":"35ba0ac2681d8bae85c8f0b641f7d5c041227b9b99b28cecbd9d96adea3eaee2"}
Mar 18 18:47:47 crc kubenswrapper[4830]: I0318 18:47:47.458816 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerStarted","Data":"cfe0d6d49cefe7a794e22ca0c6056d96d82471c7059ad5a76820d4ff6a6e88a3"}
Mar 18 18:47:48 crc kubenswrapper[4830]: I0318 18:47:48.470401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerStarted","Data":"f02cceb372984fc86631d8033f3052186fcd82ab02e15265a1b08de47cd622ac"}
Mar 18 18:47:49 crc kubenswrapper[4830]: I0318 18:47:49.481215 4830 generic.go:334] "Generic (PLEG): container finished" podID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerID="f02cceb372984fc86631d8033f3052186fcd82ab02e15265a1b08de47cd622ac" exitCode=0
Mar 18 18:47:49 crc kubenswrapper[4830]: I0318 18:47:49.481340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerDied","Data":"f02cceb372984fc86631d8033f3052186fcd82ab02e15265a1b08de47cd622ac"}
Mar 18 18:47:50 crc kubenswrapper[4830]: I0318 18:47:50.493069 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerStarted","Data":"1c43421f82f0d7b07c082c17cb6c1e22e5eafae84e9c236cb4488cdd776a18b7"}
Mar 18 18:47:50 crc kubenswrapper[4830]: I0318 18:47:50.518866 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r7vcs" podStartSLOduration=2.994329191 podStartE2EDuration="5.518845484s" podCreationTimestamp="2026-03-18 18:47:45 +0000 UTC" firstStartedPulling="2026-03-18 18:47:47.461932983 +0000 UTC m=+2702.029563365" lastFinishedPulling="2026-03-18 18:47:49.986449296 +0000 UTC m=+2704.554079658" observedRunningTime="2026-03-18 18:47:50.513808342 +0000 UTC m=+2705.081438694" watchObservedRunningTime="2026-03-18 18:47:50.518845484 +0000 UTC m=+2705.086475826"
Mar 18 18:47:56 crc kubenswrapper[4830]: I0318 18:47:56.138057 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:56 crc kubenswrapper[4830]: I0318 18:47:56.138820 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:56 crc kubenswrapper[4830]: I0318 18:47:56.209596 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:56 crc kubenswrapper[4830]: I0318 18:47:56.625847 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r7vcs"
Mar 18 18:47:56 crc kubenswrapper[4830]: I0318 18:47:56.700904 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"]
Mar 18 18:47:58 crc kubenswrapper[4830]: I0318 18:47:58.573241 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r7vcs" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="registry-server" containerID="cri-o://1c43421f82f0d7b07c082c17cb6c1e22e5eafae84e9c236cb4488cdd776a18b7" gracePeriod=2
Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.581850 4830 generic.go:334] "Generic (PLEG): container finished" podID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerID="1c43421f82f0d7b07c082c17cb6c1e22e5eafae84e9c236cb4488cdd776a18b7" exitCode=0
Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.581927 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerDied","Data":"1c43421f82f0d7b07c082c17cb6c1e22e5eafae84e9c236cb4488cdd776a18b7"}
Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.582122 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r7vcs" event={"ID":"870fe6e6-266b-46da-bc1f-10bb50124ca4","Type":"ContainerDied","Data":"cfe0d6d49cefe7a794e22ca0c6056d96d82471c7059ad5a76820d4ff6a6e88a3"}
Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.582134 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe0d6d49cefe7a794e22ca0c6056d96d82471c7059ad5a76820d4ff6a6e88a3"
Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.601480 4830 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-r7vcs" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.698428 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content\") pod \"870fe6e6-266b-46da-bc1f-10bb50124ca4\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.698602 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities\") pod \"870fe6e6-266b-46da-bc1f-10bb50124ca4\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.698651 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbc6g\" (UniqueName: \"kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g\") pod \"870fe6e6-266b-46da-bc1f-10bb50124ca4\" (UID: \"870fe6e6-266b-46da-bc1f-10bb50124ca4\") " Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.699693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities" (OuterVolumeSpecName: "utilities") pod "870fe6e6-266b-46da-bc1f-10bb50124ca4" (UID: "870fe6e6-266b-46da-bc1f-10bb50124ca4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.704062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g" (OuterVolumeSpecName: "kube-api-access-tbc6g") pod "870fe6e6-266b-46da-bc1f-10bb50124ca4" (UID: "870fe6e6-266b-46da-bc1f-10bb50124ca4"). InnerVolumeSpecName "kube-api-access-tbc6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.767830 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "870fe6e6-266b-46da-bc1f-10bb50124ca4" (UID: "870fe6e6-266b-46da-bc1f-10bb50124ca4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.800464 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.800502 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870fe6e6-266b-46da-bc1f-10bb50124ca4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:47:59 crc kubenswrapper[4830]: I0318 18:47:59.800516 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbc6g\" (UniqueName: \"kubernetes.io/projected/870fe6e6-266b-46da-bc1f-10bb50124ca4-kube-api-access-tbc6g\") on node \"crc\" DevicePath \"\"" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.162817 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564328-lrppf"] Mar 18 18:48:00 crc kubenswrapper[4830]: E0318 18:48:00.163290 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="extract-content" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.163317 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="extract-content" Mar 18 18:48:00 crc kubenswrapper[4830]: E0318 18:48:00.163359 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="registry-server" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.163373 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="registry-server" Mar 18 18:48:00 crc kubenswrapper[4830]: E0318 18:48:00.163402 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="extract-utilities" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.163418 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="extract-utilities" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.163664 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" containerName="registry-server" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.164452 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.168114 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.168188 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.168194 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.174664 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-lrppf"] Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.311877 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fch\" (UniqueName: 
\"kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch\") pod \"auto-csr-approver-29564328-lrppf\" (UID: \"d373e727-b951-4e75-a61a-64ab2069e835\") " pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.414017 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch\") pod \"auto-csr-approver-29564328-lrppf\" (UID: \"d373e727-b951-4e75-a61a-64ab2069e835\") " pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.431753 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch\") pod \"auto-csr-approver-29564328-lrppf\" (UID: \"d373e727-b951-4e75-a61a-64ab2069e835\") " pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.485629 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.591003 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r7vcs" Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.630006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"] Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.639557 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r7vcs"] Mar 18 18:48:00 crc kubenswrapper[4830]: I0318 18:48:00.771927 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-lrppf"] Mar 18 18:48:00 crc kubenswrapper[4830]: W0318 18:48:00.780945 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd373e727_b951_4e75_a61a_64ab2069e835.slice/crio-3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007 WatchSource:0}: Error finding container 3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007: Status 404 returned error can't find the container with id 3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007 Mar 18 18:48:01 crc kubenswrapper[4830]: I0318 18:48:01.603506 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-lrppf" event={"ID":"d373e727-b951-4e75-a61a-64ab2069e835","Type":"ContainerStarted","Data":"3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007"} Mar 18 18:48:02 crc kubenswrapper[4830]: I0318 18:48:02.252254 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870fe6e6-266b-46da-bc1f-10bb50124ca4" path="/var/lib/kubelet/pods/870fe6e6-266b-46da-bc1f-10bb50124ca4/volumes" Mar 18 18:48:02 crc kubenswrapper[4830]: I0318 18:48:02.611984 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-lrppf" 
event={"ID":"d373e727-b951-4e75-a61a-64ab2069e835","Type":"ContainerStarted","Data":"de253278754c4cf91bf942e58e4c9caf033ee69c98b5ffbfdb02128871a5cfab"} Mar 18 18:48:02 crc kubenswrapper[4830]: I0318 18:48:02.628583 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564328-lrppf" podStartSLOduration=1.504548086 podStartE2EDuration="2.628566739s" podCreationTimestamp="2026-03-18 18:48:00 +0000 UTC" firstStartedPulling="2026-03-18 18:48:00.784273749 +0000 UTC m=+2715.351904091" lastFinishedPulling="2026-03-18 18:48:01.908292392 +0000 UTC m=+2716.475922744" observedRunningTime="2026-03-18 18:48:02.626888461 +0000 UTC m=+2717.194518793" watchObservedRunningTime="2026-03-18 18:48:02.628566739 +0000 UTC m=+2717.196197071" Mar 18 18:48:03 crc kubenswrapper[4830]: I0318 18:48:03.621991 4830 generic.go:334] "Generic (PLEG): container finished" podID="d373e727-b951-4e75-a61a-64ab2069e835" containerID="de253278754c4cf91bf942e58e4c9caf033ee69c98b5ffbfdb02128871a5cfab" exitCode=0 Mar 18 18:48:03 crc kubenswrapper[4830]: I0318 18:48:03.622097 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-lrppf" event={"ID":"d373e727-b951-4e75-a61a-64ab2069e835","Type":"ContainerDied","Data":"de253278754c4cf91bf942e58e4c9caf033ee69c98b5ffbfdb02128871a5cfab"} Mar 18 18:48:04 crc kubenswrapper[4830]: I0318 18:48:04.925003 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:04 crc kubenswrapper[4830]: I0318 18:48:04.980530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch\") pod \"d373e727-b951-4e75-a61a-64ab2069e835\" (UID: \"d373e727-b951-4e75-a61a-64ab2069e835\") " Mar 18 18:48:04 crc kubenswrapper[4830]: I0318 18:48:04.989398 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch" (OuterVolumeSpecName: "kube-api-access-v6fch") pod "d373e727-b951-4e75-a61a-64ab2069e835" (UID: "d373e727-b951-4e75-a61a-64ab2069e835"). InnerVolumeSpecName "kube-api-access-v6fch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.081949 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/d373e727-b951-4e75-a61a-64ab2069e835-kube-api-access-v6fch\") on node \"crc\" DevicePath \"\"" Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.636176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-lrppf" event={"ID":"d373e727-b951-4e75-a61a-64ab2069e835","Type":"ContainerDied","Data":"3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007"} Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.636215 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3787354733d0713ec1a6c7e87a0edfe1cf876c87892aa336c4d52799eedcd007" Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.636248 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-lrppf" Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.721159 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-jh2dp"] Mar 18 18:48:05 crc kubenswrapper[4830]: I0318 18:48:05.729463 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-jh2dp"] Mar 18 18:48:06 crc kubenswrapper[4830]: I0318 18:48:06.247327 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf3284a-e858-4458-b1d5-a0d3f19226d1" path="/var/lib/kubelet/pods/daf3284a-e858-4458-b1d5-a0d3f19226d1/volumes" Mar 18 18:48:17 crc kubenswrapper[4830]: I0318 18:48:17.276072 4830 scope.go:117] "RemoveContainer" containerID="87d22b35ee34ec1ab9a227e039de5f91f3a7820d8b0c64c9139efe4f5ac709b4" Mar 18 18:48:29 crc kubenswrapper[4830]: I0318 18:48:29.509857 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:48:29 crc kubenswrapper[4830]: I0318 18:48:29.510884 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:48:59 crc kubenswrapper[4830]: I0318 18:48:59.509341 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:48:59 crc kubenswrapper[4830]: 
I0318 18:48:59.510236 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:49:29 crc kubenswrapper[4830]: I0318 18:49:29.510207 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:49:29 crc kubenswrapper[4830]: I0318 18:49:29.511000 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:49:29 crc kubenswrapper[4830]: I0318 18:49:29.511085 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 18:49:29 crc kubenswrapper[4830]: I0318 18:49:29.512350 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:49:29 crc kubenswrapper[4830]: I0318 18:49:29.512452 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" 
containerName="machine-config-daemon" containerID="cri-o://4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4" gracePeriod=600 Mar 18 18:49:30 crc kubenswrapper[4830]: I0318 18:49:30.429053 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4" exitCode=0 Mar 18 18:49:30 crc kubenswrapper[4830]: I0318 18:49:30.429155 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4"} Mar 18 18:49:30 crc kubenswrapper[4830]: I0318 18:49:30.429703 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"} Mar 18 18:49:30 crc kubenswrapper[4830]: I0318 18:49:30.429738 4830 scope.go:117] "RemoveContainer" containerID="dfbcf38a330ef2a2c30556ca081227685219cf1b5ceefae31a414ea5e6724c04" Mar 18 18:49:32 crc kubenswrapper[4830]: I0318 18:49:32.991591 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t2jp6"] Mar 18 18:49:32 crc kubenswrapper[4830]: E0318 18:49:32.992561 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d373e727-b951-4e75-a61a-64ab2069e835" containerName="oc" Mar 18 18:49:32 crc kubenswrapper[4830]: I0318 18:49:32.992579 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d373e727-b951-4e75-a61a-64ab2069e835" containerName="oc" Mar 18 18:49:32 crc kubenswrapper[4830]: I0318 18:49:32.992793 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d373e727-b951-4e75-a61a-64ab2069e835" containerName="oc" Mar 18 18:49:32 crc 
kubenswrapper[4830]: I0318 18:49:32.993948 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.007641 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2jp6"] Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.162693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.162890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24p88\" (UniqueName: \"kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.162956 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.264401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc 
kubenswrapper[4830]: I0318 18:49:33.264936 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.265116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.265410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.265853 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24p88\" (UniqueName: \"kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.291420 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24p88\" (UniqueName: \"kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88\") pod \"community-operators-t2jp6\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") " pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 
18:49:33.325054 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2jp6" Mar 18 18:49:33 crc kubenswrapper[4830]: I0318 18:49:33.854404 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t2jp6"] Mar 18 18:49:34 crc kubenswrapper[4830]: I0318 18:49:34.514674 4830 generic.go:334] "Generic (PLEG): container finished" podID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerID="bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d" exitCode=0 Mar 18 18:49:34 crc kubenswrapper[4830]: I0318 18:49:34.514815 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerDied","Data":"bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d"} Mar 18 18:49:34 crc kubenswrapper[4830]: I0318 18:49:34.515116 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerStarted","Data":"27e588ac3e344b9fa42e8396a9464634e70becc88dc1e06c70689f8d0a7499c3"} Mar 18 18:49:34 crc kubenswrapper[4830]: I0318 18:49:34.519130 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:49:36 crc kubenswrapper[4830]: I0318 18:49:36.534758 4830 generic.go:334] "Generic (PLEG): container finished" podID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerID="7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148" exitCode=0 Mar 18 18:49:36 crc kubenswrapper[4830]: I0318 18:49:36.535006 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerDied","Data":"7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148"} Mar 18 18:49:37 crc 
kubenswrapper[4830]: I0318 18:49:37.552251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerStarted","Data":"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"}
Mar 18 18:49:37 crc kubenswrapper[4830]: I0318 18:49:37.588951 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t2jp6" podStartSLOduration=2.993378272 podStartE2EDuration="5.588927425s" podCreationTimestamp="2026-03-18 18:49:32 +0000 UTC" firstStartedPulling="2026-03-18 18:49:34.518673726 +0000 UTC m=+2809.086304088" lastFinishedPulling="2026-03-18 18:49:37.114222869 +0000 UTC m=+2811.681853241" observedRunningTime="2026-03-18 18:49:37.581812813 +0000 UTC m=+2812.149443185" watchObservedRunningTime="2026-03-18 18:49:37.588927425 +0000 UTC m=+2812.156557767"
Mar 18 18:49:43 crc kubenswrapper[4830]: I0318 18:49:43.325979 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:43 crc kubenswrapper[4830]: I0318 18:49:43.326432 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:43 crc kubenswrapper[4830]: I0318 18:49:43.401886 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:43 crc kubenswrapper[4830]: I0318 18:49:43.689090 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:43 crc kubenswrapper[4830]: I0318 18:49:43.763331 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2jp6"]
Mar 18 18:49:45 crc kubenswrapper[4830]: I0318 18:49:45.631692 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t2jp6" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="registry-server" containerID="cri-o://42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae" gracePeriod=2
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.109699 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.295925 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24p88\" (UniqueName: \"kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88\") pod \"73d64342-5047-4ed1-87f9-fc9c280468b2\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") "
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.296523 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content\") pod \"73d64342-5047-4ed1-87f9-fc9c280468b2\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") "
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.296729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities\") pod \"73d64342-5047-4ed1-87f9-fc9c280468b2\" (UID: \"73d64342-5047-4ed1-87f9-fc9c280468b2\") "
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.298300 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities" (OuterVolumeSpecName: "utilities") pod "73d64342-5047-4ed1-87f9-fc9c280468b2" (UID: "73d64342-5047-4ed1-87f9-fc9c280468b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.308141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88" (OuterVolumeSpecName: "kube-api-access-24p88") pod "73d64342-5047-4ed1-87f9-fc9c280468b2" (UID: "73d64342-5047-4ed1-87f9-fc9c280468b2"). InnerVolumeSpecName "kube-api-access-24p88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.398741 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24p88\" (UniqueName: \"kubernetes.io/projected/73d64342-5047-4ed1-87f9-fc9c280468b2-kube-api-access-24p88\") on node \"crc\" DevicePath \"\""
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.399310 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.646658 4830 generic.go:334] "Generic (PLEG): container finished" podID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerID="42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae" exitCode=0
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.646727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerDied","Data":"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"}
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.646903 4830 scope.go:117] "RemoveContainer" containerID="42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.646929 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t2jp6" event={"ID":"73d64342-5047-4ed1-87f9-fc9c280468b2","Type":"ContainerDied","Data":"27e588ac3e344b9fa42e8396a9464634e70becc88dc1e06c70689f8d0a7499c3"}
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.648668 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t2jp6"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.677863 4830 scope.go:117] "RemoveContainer" containerID="7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.713391 4830 scope.go:117] "RemoveContainer" containerID="bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.754407 4830 scope.go:117] "RemoveContainer" containerID="42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"
Mar 18 18:49:46 crc kubenswrapper[4830]: E0318 18:49:46.755251 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae\": container with ID starting with 42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae not found: ID does not exist" containerID="42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.755442 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae"} err="failed to get container status \"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae\": rpc error: code = NotFound desc = could not find container \"42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae\": container with ID starting with 42ad489e289c39f55c53ce39cc97fd376eb91e2cf0c1e222e075594a226da9ae not found: ID does not exist"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.755578 4830 scope.go:117] "RemoveContainer" containerID="7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148"
Mar 18 18:49:46 crc kubenswrapper[4830]: E0318 18:49:46.756278 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148\": container with ID starting with 7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148 not found: ID does not exist" containerID="7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.756356 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148"} err="failed to get container status \"7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148\": rpc error: code = NotFound desc = could not find container \"7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148\": container with ID starting with 7941355f14fdbf3167a9ef5b0b3970e26b942857df04f965612f61b7e1038148 not found: ID does not exist"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.756402 4830 scope.go:117] "RemoveContainer" containerID="bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d"
Mar 18 18:49:46 crc kubenswrapper[4830]: E0318 18:49:46.756922 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d\": container with ID starting with bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d not found: ID does not exist" containerID="bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.756990 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d"} err="failed to get container status \"bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d\": rpc error: code = NotFound desc = could not find container \"bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d\": container with ID starting with bd14af7370d706b04b7c3cfe66ebc1171b5c023c914e2dfc212935b9c256429d not found: ID does not exist"
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.898279 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73d64342-5047-4ed1-87f9-fc9c280468b2" (UID: "73d64342-5047-4ed1-87f9-fc9c280468b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:49:46 crc kubenswrapper[4830]: I0318 18:49:46.907386 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d64342-5047-4ed1-87f9-fc9c280468b2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:49:47 crc kubenswrapper[4830]: I0318 18:49:47.011970 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t2jp6"]
Mar 18 18:49:47 crc kubenswrapper[4830]: I0318 18:49:47.022531 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t2jp6"]
Mar 18 18:49:48 crc kubenswrapper[4830]: I0318 18:49:48.247324 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" path="/var/lib/kubelet/pods/73d64342-5047-4ed1-87f9-fc9c280468b2/volumes"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.149083 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564330-4rgf2"]
Mar 18 18:50:00 crc kubenswrapper[4830]: E0318 18:50:00.150419 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="extract-content"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.150444 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="extract-content"
Mar 18 18:50:00 crc kubenswrapper[4830]: E0318 18:50:00.150491 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="extract-utilities"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.150505 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="extract-utilities"
Mar 18 18:50:00 crc kubenswrapper[4830]: E0318 18:50:00.150530 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="registry-server"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.150544 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="registry-server"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.150881 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d64342-5047-4ed1-87f9-fc9c280468b2" containerName="registry-server"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.151660 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.154113 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.154283 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.154720 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-4rgf2"]
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.156759 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.257346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwvx\" (UniqueName: \"kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx\") pod \"auto-csr-approver-29564330-4rgf2\" (UID: \"5226b313-e059-4748-8821-6d8875e4b11a\") " pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.359390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwvx\" (UniqueName: \"kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx\") pod \"auto-csr-approver-29564330-4rgf2\" (UID: \"5226b313-e059-4748-8821-6d8875e4b11a\") " pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.391999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwvx\" (UniqueName: \"kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx\") pod \"auto-csr-approver-29564330-4rgf2\" (UID: \"5226b313-e059-4748-8821-6d8875e4b11a\") " pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.479537 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:00 crc kubenswrapper[4830]: I0318 18:50:00.795491 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-4rgf2"]
Mar 18 18:50:01 crc kubenswrapper[4830]: I0318 18:50:01.805958 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-4rgf2" event={"ID":"5226b313-e059-4748-8821-6d8875e4b11a","Type":"ContainerStarted","Data":"cb819e2993233a0d7cfbfb3d213f814d8bc61889e25f56867e1a1cc95bd89ff4"}
Mar 18 18:50:02 crc kubenswrapper[4830]: I0318 18:50:02.814971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-4rgf2" event={"ID":"5226b313-e059-4748-8821-6d8875e4b11a","Type":"ContainerStarted","Data":"fdc2a7f8c694c80b2b77e6a271954a157eab03b828db5e951263b7de51ba1d5a"}
Mar 18 18:50:02 crc kubenswrapper[4830]: I0318 18:50:02.835252 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564330-4rgf2" podStartSLOduration=1.260301689 podStartE2EDuration="2.835216304s" podCreationTimestamp="2026-03-18 18:50:00 +0000 UTC" firstStartedPulling="2026-03-18 18:50:00.807624647 +0000 UTC m=+2835.375255019" lastFinishedPulling="2026-03-18 18:50:02.382539262 +0000 UTC m=+2836.950169634" observedRunningTime="2026-03-18 18:50:02.830996625 +0000 UTC m=+2837.398626967" watchObservedRunningTime="2026-03-18 18:50:02.835216304 +0000 UTC m=+2837.402846736"
Mar 18 18:50:03 crc kubenswrapper[4830]: I0318 18:50:03.829858 4830 generic.go:334] "Generic (PLEG): container finished" podID="5226b313-e059-4748-8821-6d8875e4b11a" containerID="fdc2a7f8c694c80b2b77e6a271954a157eab03b828db5e951263b7de51ba1d5a" exitCode=0
Mar 18 18:50:03 crc kubenswrapper[4830]: I0318 18:50:03.829925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-4rgf2" event={"ID":"5226b313-e059-4748-8821-6d8875e4b11a","Type":"ContainerDied","Data":"fdc2a7f8c694c80b2b77e6a271954a157eab03b828db5e951263b7de51ba1d5a"}
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.305335 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.432024 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwvx\" (UniqueName: \"kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx\") pod \"5226b313-e059-4748-8821-6d8875e4b11a\" (UID: \"5226b313-e059-4748-8821-6d8875e4b11a\") "
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.437894 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx" (OuterVolumeSpecName: "kube-api-access-vcwvx") pod "5226b313-e059-4748-8821-6d8875e4b11a" (UID: "5226b313-e059-4748-8821-6d8875e4b11a"). InnerVolumeSpecName "kube-api-access-vcwvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.534322 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwvx\" (UniqueName: \"kubernetes.io/projected/5226b313-e059-4748-8821-6d8875e4b11a-kube-api-access-vcwvx\") on node \"crc\" DevicePath \"\""
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.851084 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-4rgf2" event={"ID":"5226b313-e059-4748-8821-6d8875e4b11a","Type":"ContainerDied","Data":"cb819e2993233a0d7cfbfb3d213f814d8bc61889e25f56867e1a1cc95bd89ff4"}
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.851153 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-4rgf2"
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.851161 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb819e2993233a0d7cfbfb3d213f814d8bc61889e25f56867e1a1cc95bd89ff4"
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.924719 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-6pxbs"]
Mar 18 18:50:05 crc kubenswrapper[4830]: I0318 18:50:05.934326 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-6pxbs"]
Mar 18 18:50:06 crc kubenswrapper[4830]: I0318 18:50:06.250945 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc0d3a3-b53d-4742-b41b-306b83656408" path="/var/lib/kubelet/pods/6cc0d3a3-b53d-4742-b41b-306b83656408/volumes"
Mar 18 18:50:17 crc kubenswrapper[4830]: I0318 18:50:17.398027 4830 scope.go:117] "RemoveContainer" containerID="78e9cf6ef81244f467059302f6f09545389de7fbe97b41bd2d82422f1439819c"
Mar 18 18:51:29 crc kubenswrapper[4830]: I0318 18:51:29.509895 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:51:29 crc kubenswrapper[4830]: I0318 18:51:29.510849 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:51:59 crc kubenswrapper[4830]: I0318 18:51:59.509699 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:51:59 crc kubenswrapper[4830]: I0318 18:51:59.510390 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.166897 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564332-9gdch"]
Mar 18 18:52:00 crc kubenswrapper[4830]: E0318 18:52:00.167452 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5226b313-e059-4748-8821-6d8875e4b11a" containerName="oc"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.167483 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5226b313-e059-4748-8821-6d8875e4b11a" containerName="oc"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.167883 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5226b313-e059-4748-8821-6d8875e4b11a" containerName="oc"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.168674 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.171246 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.171921 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.176299 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.177442 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-9gdch"]
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.289622 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm4dn\" (UniqueName: \"kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn\") pod \"auto-csr-approver-29564332-9gdch\" (UID: \"87f5fc92-b479-4cf5-9fb9-94f54e3d1822\") " pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.391557 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm4dn\" (UniqueName: \"kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn\") pod \"auto-csr-approver-29564332-9gdch\" (UID: \"87f5fc92-b479-4cf5-9fb9-94f54e3d1822\") " pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.426196 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm4dn\" (UniqueName: \"kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn\") pod \"auto-csr-approver-29564332-9gdch\" (UID: \"87f5fc92-b479-4cf5-9fb9-94f54e3d1822\") " pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.493304 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.748576 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-9gdch"]
Mar 18 18:52:00 crc kubenswrapper[4830]: I0318 18:52:00.983589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-9gdch" event={"ID":"87f5fc92-b479-4cf5-9fb9-94f54e3d1822","Type":"ContainerStarted","Data":"6243b0c84d738ffa2e9e564ef82057cac7d804feda38d835b23f38e1a03d20f8"}
Mar 18 18:52:03 crc kubenswrapper[4830]: I0318 18:52:03.002801 4830 generic.go:334] "Generic (PLEG): container finished" podID="87f5fc92-b479-4cf5-9fb9-94f54e3d1822" containerID="817e167b742a955d33429b2e3309b89787dc927e322abc34fd86838fb665b397" exitCode=0
Mar 18 18:52:03 crc kubenswrapper[4830]: I0318 18:52:03.002950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-9gdch" event={"ID":"87f5fc92-b479-4cf5-9fb9-94f54e3d1822","Type":"ContainerDied","Data":"817e167b742a955d33429b2e3309b89787dc927e322abc34fd86838fb665b397"}
Mar 18 18:52:04 crc kubenswrapper[4830]: I0318 18:52:04.304744 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:04 crc kubenswrapper[4830]: I0318 18:52:04.354661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm4dn\" (UniqueName: \"kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn\") pod \"87f5fc92-b479-4cf5-9fb9-94f54e3d1822\" (UID: \"87f5fc92-b479-4cf5-9fb9-94f54e3d1822\") "
Mar 18 18:52:04 crc kubenswrapper[4830]: I0318 18:52:04.363960 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn" (OuterVolumeSpecName: "kube-api-access-dm4dn") pod "87f5fc92-b479-4cf5-9fb9-94f54e3d1822" (UID: "87f5fc92-b479-4cf5-9fb9-94f54e3d1822"). InnerVolumeSpecName "kube-api-access-dm4dn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:52:04 crc kubenswrapper[4830]: I0318 18:52:04.456805 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm4dn\" (UniqueName: \"kubernetes.io/projected/87f5fc92-b479-4cf5-9fb9-94f54e3d1822-kube-api-access-dm4dn\") on node \"crc\" DevicePath \"\""
Mar 18 18:52:05 crc kubenswrapper[4830]: I0318 18:52:05.023159 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-9gdch" event={"ID":"87f5fc92-b479-4cf5-9fb9-94f54e3d1822","Type":"ContainerDied","Data":"6243b0c84d738ffa2e9e564ef82057cac7d804feda38d835b23f38e1a03d20f8"}
Mar 18 18:52:05 crc kubenswrapper[4830]: I0318 18:52:05.023520 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6243b0c84d738ffa2e9e564ef82057cac7d804feda38d835b23f38e1a03d20f8"
Mar 18 18:52:05 crc kubenswrapper[4830]: I0318 18:52:05.023319 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-9gdch"
Mar 18 18:52:05 crc kubenswrapper[4830]: I0318 18:52:05.402183 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-5z5ds"]
Mar 18 18:52:05 crc kubenswrapper[4830]: I0318 18:52:05.418120 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-5z5ds"]
Mar 18 18:52:06 crc kubenswrapper[4830]: I0318 18:52:06.252628 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a180a8ce-62ba-4d86-847f-dd0db0f293b9" path="/var/lib/kubelet/pods/a180a8ce-62ba-4d86-847f-dd0db0f293b9/volumes"
Mar 18 18:52:17 crc kubenswrapper[4830]: I0318 18:52:17.554710 4830 scope.go:117] "RemoveContainer" containerID="5a596de1cc1ef25b9c2f1bf13a756c76d693878a7468ef0d8169a8b61643c8d3"
Mar 18 18:52:29 crc kubenswrapper[4830]: I0318 18:52:29.509499 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:52:29 crc kubenswrapper[4830]: I0318 18:52:29.510238 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:52:29 crc kubenswrapper[4830]: I0318 18:52:29.510310 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 18:52:29 crc kubenswrapper[4830]: I0318 18:52:29.511319 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:52:29 crc kubenswrapper[4830]: I0318 18:52:29.511406 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" gracePeriod=600
Mar 18 18:52:29 crc kubenswrapper[4830]: E0318 18:52:29.649815 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:52:30 crc kubenswrapper[4830]: I0318 18:52:30.285721 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" exitCode=0
Mar 18 18:52:30 crc kubenswrapper[4830]: I0318 18:52:30.285820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"}
Mar 18 18:52:30 crc kubenswrapper[4830]: I0318 18:52:30.285993 4830 scope.go:117] "RemoveContainer" containerID="4b326b3494842e2defd3cf2873c94d0096bb5d3646fec642506cd4fe0c2fedd4"
Mar 18 18:52:30 crc kubenswrapper[4830]: I0318 18:52:30.286813 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:52:30 crc kubenswrapper[4830]: E0318 18:52:30.287640 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:52:42 crc kubenswrapper[4830]: I0318 18:52:42.235641 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:52:42 crc kubenswrapper[4830]: E0318 18:52:42.238130 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:52:56 crc kubenswrapper[4830]: I0318 18:52:56.242550 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:52:56 crc kubenswrapper[4830]: E0318 18:52:56.243536 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:53:08 crc kubenswrapper[4830]: I0318 18:53:08.237231 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:53:08 crc kubenswrapper[4830]: E0318 18:53:08.239809 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:53:23 crc kubenswrapper[4830]: I0318 18:53:23.235518 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:53:23 crc kubenswrapper[4830]: E0318 18:53:23.238109 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:53:37 crc kubenswrapper[4830]: I0318 18:53:37.234403 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:53:37 crc kubenswrapper[4830]: E0318 18:53:37.235598 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:53:48 crc kubenswrapper[4830]: I0318 18:53:48.234926 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:53:48 crc kubenswrapper[4830]: E0318 18:53:48.236322 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.147746 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564334-s8dn2"]
Mar 18 18:54:00 crc kubenswrapper[4830]: E0318 18:54:00.148576 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f5fc92-b479-4cf5-9fb9-94f54e3d1822" containerName="oc"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.148589 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f5fc92-b479-4cf5-9fb9-94f54e3d1822" containerName="oc"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.148730 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f5fc92-b479-4cf5-9fb9-94f54e3d1822" containerName="oc"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.149307 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-s8dn2"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.154304 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.154332 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.154404 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.160384 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-s8dn2"]
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.301485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqtqk\" (UniqueName: \"kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk\") pod \"auto-csr-approver-29564334-s8dn2\" (UID: \"cdb4e508-7cc8-431d-9978-0bef156b5adb\") " pod="openshift-infra/auto-csr-approver-29564334-s8dn2"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.403172 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqtqk\" (UniqueName: \"kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk\") pod \"auto-csr-approver-29564334-s8dn2\" (UID: \"cdb4e508-7cc8-431d-9978-0bef156b5adb\") " pod="openshift-infra/auto-csr-approver-29564334-s8dn2"
Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.429932 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqtqk\" (UniqueName: \"kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk\") pod \"auto-csr-approver-29564334-s8dn2\" (UID: \"cdb4e508-7cc8-431d-9978-0bef156b5adb\") "
pod="openshift-infra/auto-csr-approver-29564334-s8dn2" Mar 18 18:54:00 crc kubenswrapper[4830]: I0318 18:54:00.514054 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-s8dn2" Mar 18 18:54:01 crc kubenswrapper[4830]: I0318 18:54:01.010826 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-s8dn2"] Mar 18 18:54:01 crc kubenswrapper[4830]: I0318 18:54:01.171103 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-s8dn2" event={"ID":"cdb4e508-7cc8-431d-9978-0bef156b5adb","Type":"ContainerStarted","Data":"dcea73f3af364dce5e0a454f00d3af3bee6c83ed1b38562def27b9ee1931a83d"} Mar 18 18:54:02 crc kubenswrapper[4830]: I0318 18:54:02.235187 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:54:02 crc kubenswrapper[4830]: E0318 18:54:02.235594 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:54:03 crc kubenswrapper[4830]: I0318 18:54:03.189072 4830 generic.go:334] "Generic (PLEG): container finished" podID="cdb4e508-7cc8-431d-9978-0bef156b5adb" containerID="b9dc640e172bbbdfdd8d24c86f2f50f970067cff0e2767fa98a7ab840fefea77" exitCode=0 Mar 18 18:54:03 crc kubenswrapper[4830]: I0318 18:54:03.189158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-s8dn2" event={"ID":"cdb4e508-7cc8-431d-9978-0bef156b5adb","Type":"ContainerDied","Data":"b9dc640e172bbbdfdd8d24c86f2f50f970067cff0e2767fa98a7ab840fefea77"} 
Mar 18 18:54:04 crc kubenswrapper[4830]: I0318 18:54:04.483475 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-s8dn2"
Mar 18 18:54:04 crc kubenswrapper[4830]: I0318 18:54:04.573458 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqtqk\" (UniqueName: \"kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk\") pod \"cdb4e508-7cc8-431d-9978-0bef156b5adb\" (UID: \"cdb4e508-7cc8-431d-9978-0bef156b5adb\") "
Mar 18 18:54:04 crc kubenswrapper[4830]: I0318 18:54:04.578695 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk" (OuterVolumeSpecName: "kube-api-access-jqtqk") pod "cdb4e508-7cc8-431d-9978-0bef156b5adb" (UID: "cdb4e508-7cc8-431d-9978-0bef156b5adb"). InnerVolumeSpecName "kube-api-access-jqtqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:54:04 crc kubenswrapper[4830]: I0318 18:54:04.675387 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqtqk\" (UniqueName: \"kubernetes.io/projected/cdb4e508-7cc8-431d-9978-0bef156b5adb-kube-api-access-jqtqk\") on node \"crc\" DevicePath \"\""
Mar 18 18:54:05 crc kubenswrapper[4830]: I0318 18:54:05.216848 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-s8dn2" event={"ID":"cdb4e508-7cc8-431d-9978-0bef156b5adb","Type":"ContainerDied","Data":"dcea73f3af364dce5e0a454f00d3af3bee6c83ed1b38562def27b9ee1931a83d"}
Mar 18 18:54:05 crc kubenswrapper[4830]: I0318 18:54:05.216906 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcea73f3af364dce5e0a454f00d3af3bee6c83ed1b38562def27b9ee1931a83d"
Mar 18 18:54:05 crc kubenswrapper[4830]: I0318 18:54:05.217002 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-s8dn2"
Mar 18 18:54:05 crc kubenswrapper[4830]: I0318 18:54:05.568532 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-lrppf"]
Mar 18 18:54:05 crc kubenswrapper[4830]: I0318 18:54:05.573891 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-lrppf"]
Mar 18 18:54:06 crc kubenswrapper[4830]: I0318 18:54:06.251915 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d373e727-b951-4e75-a61a-64ab2069e835" path="/var/lib/kubelet/pods/d373e727-b951-4e75-a61a-64ab2069e835/volumes"
Mar 18 18:54:16 crc kubenswrapper[4830]: I0318 18:54:16.243079 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:54:16 crc kubenswrapper[4830]: E0318 18:54:16.244190 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:54:17 crc kubenswrapper[4830]: I0318 18:54:17.665613 4830 scope.go:117] "RemoveContainer" containerID="1c43421f82f0d7b07c082c17cb6c1e22e5eafae84e9c236cb4488cdd776a18b7"
Mar 18 18:54:17 crc kubenswrapper[4830]: I0318 18:54:17.694577 4830 scope.go:117] "RemoveContainer" containerID="35ba0ac2681d8bae85c8f0b641f7d5c041227b9b99b28cecbd9d96adea3eaee2"
Mar 18 18:54:17 crc kubenswrapper[4830]: I0318 18:54:17.728054 4830 scope.go:117] "RemoveContainer" containerID="f02cceb372984fc86631d8033f3052186fcd82ab02e15265a1b08de47cd622ac"
Mar 18 18:54:17 crc kubenswrapper[4830]: I0318 18:54:17.778349 4830 scope.go:117] "RemoveContainer" containerID="de253278754c4cf91bf942e58e4c9caf033ee69c98b5ffbfdb02128871a5cfab"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.635363 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"]
Mar 18 18:54:26 crc kubenswrapper[4830]: E0318 18:54:26.637858 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb4e508-7cc8-431d-9978-0bef156b5adb" containerName="oc"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.638031 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb4e508-7cc8-431d-9978-0bef156b5adb" containerName="oc"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.638414 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb4e508-7cc8-431d-9978-0bef156b5adb" containerName="oc"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.640278 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.660065 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"]
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.727238 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.727303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvw6\" (UniqueName: \"kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.727389 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.827799 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.828389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.828439 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvw6\" (UniqueName: \"kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.828495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.829100 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.830067 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.830612 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.854402 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.883644 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvw6\" (UniqueName: \"kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6\") pod \"redhat-operators-v47sx\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") " pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.929844 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.929934 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:26 crc kubenswrapper[4830]: I0318 18:54:26.929978 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhh9\" (UniqueName: \"kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.002003 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.030860 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.031060 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.031217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvhh9\" (UniqueName: \"kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.032442 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.032537 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.056736 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvhh9\" (UniqueName: \"kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9\") pod \"redhat-marketplace-m766p\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") " pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.159466 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.234392 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:54:27 crc kubenswrapper[4830]: E0318 18:54:27.234596 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.460814 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"]
Mar 18 18:54:27 crc kubenswrapper[4830]: I0318 18:54:27.589006 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.421756 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerID="8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821" exitCode=0
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.421822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerDied","Data":"8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821"}
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.422193 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerStarted","Data":"a9eddd176b99ae0d41ec3529785bbba431bafa899a1e12f506dfc959320bcdd0"}
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.425843 4830 generic.go:334] "Generic (PLEG): container finished" podID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerID="51c3132c5a72bedd7242d220a9d7f923c692ca64603fa0c59538bc12462ac6ae" exitCode=0
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.425905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerDied","Data":"51c3132c5a72bedd7242d220a9d7f923c692ca64603fa0c59538bc12462ac6ae"}
Mar 18 18:54:28 crc kubenswrapper[4830]: I0318 18:54:28.425945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerStarted","Data":"4df4528be95b7a678b9f7ac514eeb7fb6172ff150b40c0f070484308bf69c159"}
Mar 18 18:54:29 crc kubenswrapper[4830]: I0318 18:54:29.462270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerStarted","Data":"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50"}
Mar 18 18:54:29 crc kubenswrapper[4830]: I0318 18:54:29.465397 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerStarted","Data":"514f0bd6039ae210d47b4a3cdce7aefeb8b004263c4233a421fddd1775b3fcfb"}
Mar 18 18:54:30 crc kubenswrapper[4830]: I0318 18:54:30.479652 4830 generic.go:334] "Generic (PLEG): container finished" podID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerID="514f0bd6039ae210d47b4a3cdce7aefeb8b004263c4233a421fddd1775b3fcfb" exitCode=0
Mar 18 18:54:30 crc kubenswrapper[4830]: I0318 18:54:30.479728 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerDied","Data":"514f0bd6039ae210d47b4a3cdce7aefeb8b004263c4233a421fddd1775b3fcfb"}
Mar 18 18:54:30 crc kubenswrapper[4830]: I0318 18:54:30.490226 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerID="3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50" exitCode=0
Mar 18 18:54:30 crc kubenswrapper[4830]: I0318 18:54:30.490281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerDied","Data":"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50"}
Mar 18 18:54:31 crc kubenswrapper[4830]: I0318 18:54:31.504373 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerStarted","Data":"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab"}
Mar 18 18:54:31 crc kubenswrapper[4830]: I0318 18:54:31.507397 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerStarted","Data":"c55de0bb92b51cc02ec92a1aced36d6131f1ccc7633574cc12ef19b8e46b3e19"}
Mar 18 18:54:31 crc kubenswrapper[4830]: I0318 18:54:31.533087 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v47sx" podStartSLOduration=2.937679134 podStartE2EDuration="5.533069882s" podCreationTimestamp="2026-03-18 18:54:26 +0000 UTC" firstStartedPulling="2026-03-18 18:54:28.424595413 +0000 UTC m=+3102.992225745" lastFinishedPulling="2026-03-18 18:54:31.019986121 +0000 UTC m=+3105.587616493" observedRunningTime="2026-03-18 18:54:31.527343839 +0000 UTC m=+3106.094974211" watchObservedRunningTime="2026-03-18 18:54:31.533069882 +0000 UTC m=+3106.100700214"
Mar 18 18:54:31 crc kubenswrapper[4830]: I0318 18:54:31.560916 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m766p" podStartSLOduration=3.041297009 podStartE2EDuration="5.560900836s" podCreationTimestamp="2026-03-18 18:54:26 +0000 UTC" firstStartedPulling="2026-03-18 18:54:28.427122855 +0000 UTC m=+3102.994753187" lastFinishedPulling="2026-03-18 18:54:30.946726672 +0000 UTC m=+3105.514357014" observedRunningTime="2026-03-18 18:54:31.559448334 +0000 UTC m=+3106.127078726" watchObservedRunningTime="2026-03-18 18:54:31.560900836 +0000 UTC m=+3106.128531168"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.002107 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.002741 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.160224 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.160310 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.210901 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.649025 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:37 crc kubenswrapper[4830]: I0318 18:54:37.715037 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:38 crc kubenswrapper[4830]: I0318 18:54:38.050915 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v47sx" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:54:38 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:54:38 crc kubenswrapper[4830]: >
Mar 18 18:54:39 crc kubenswrapper[4830]: I0318 18:54:39.589011 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m766p" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="registry-server" containerID="cri-o://c55de0bb92b51cc02ec92a1aced36d6131f1ccc7633574cc12ef19b8e46b3e19" gracePeriod=2
Mar 18 18:54:41 crc kubenswrapper[4830]: I0318 18:54:41.235535 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 18:54:41 crc kubenswrapper[4830]: E0318 18:54:41.236814 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 18:54:41 crc kubenswrapper[4830]: I0318 18:54:41.615945 4830 generic.go:334] "Generic (PLEG): container finished" podID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerID="c55de0bb92b51cc02ec92a1aced36d6131f1ccc7633574cc12ef19b8e46b3e19" exitCode=0
Mar 18 18:54:41 crc kubenswrapper[4830]: I0318 18:54:41.616053 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerDied","Data":"c55de0bb92b51cc02ec92a1aced36d6131f1ccc7633574cc12ef19b8e46b3e19"}
Mar 18 18:54:41 crc kubenswrapper[4830]: I0318 18:54:41.933599 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.059830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvhh9\" (UniqueName: \"kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9\") pod \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") "
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.060218 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content\") pod \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") "
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.060404 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities\") pod \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\" (UID: \"31bc522d-051d-45fa-b4cd-fc7f12a796ae\") "
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.061319 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities" (OuterVolumeSpecName: "utilities") pod "31bc522d-051d-45fa-b4cd-fc7f12a796ae" (UID: "31bc522d-051d-45fa-b4cd-fc7f12a796ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.065896 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9" (OuterVolumeSpecName: "kube-api-access-tvhh9") pod "31bc522d-051d-45fa-b4cd-fc7f12a796ae" (UID: "31bc522d-051d-45fa-b4cd-fc7f12a796ae"). InnerVolumeSpecName "kube-api-access-tvhh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.115113 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31bc522d-051d-45fa-b4cd-fc7f12a796ae" (UID: "31bc522d-051d-45fa-b4cd-fc7f12a796ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.162074 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvhh9\" (UniqueName: \"kubernetes.io/projected/31bc522d-051d-45fa-b4cd-fc7f12a796ae-kube-api-access-tvhh9\") on node \"crc\" DevicePath \"\""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.162122 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.162134 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31bc522d-051d-45fa-b4cd-fc7f12a796ae-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.629905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m766p" event={"ID":"31bc522d-051d-45fa-b4cd-fc7f12a796ae","Type":"ContainerDied","Data":"4df4528be95b7a678b9f7ac514eeb7fb6172ff150b40c0f070484308bf69c159"}
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.629976 4830 scope.go:117] "RemoveContainer" containerID="c55de0bb92b51cc02ec92a1aced36d6131f1ccc7633574cc12ef19b8e46b3e19"
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.630020 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m766p"
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.655043 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.661713 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m766p"]
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.662989 4830 scope.go:117] "RemoveContainer" containerID="514f0bd6039ae210d47b4a3cdce7aefeb8b004263c4233a421fddd1775b3fcfb"
Mar 18 18:54:42 crc kubenswrapper[4830]: I0318 18:54:42.689793 4830 scope.go:117] "RemoveContainer" containerID="51c3132c5a72bedd7242d220a9d7f923c692ca64603fa0c59538bc12462ac6ae"
Mar 18 18:54:44 crc kubenswrapper[4830]: I0318 18:54:44.242386 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" path="/var/lib/kubelet/pods/31bc522d-051d-45fa-b4cd-fc7f12a796ae/volumes"
Mar 18 18:54:47 crc kubenswrapper[4830]: I0318 18:54:47.074428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:47 crc kubenswrapper[4830]: I0318 18:54:47.156764 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:47 crc kubenswrapper[4830]: I0318 18:54:47.323435 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"]
Mar 18 18:54:48 crc kubenswrapper[4830]: I0318 18:54:48.692405 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v47sx" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="registry-server" containerID="cri-o://c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab" gracePeriod=2
Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.171400 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v47sx"
Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.286123 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content\") pod \"d6fea67e-88da-4de6-95c7-68b32d1016aa\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") "
Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.286270 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvw6\" (UniqueName: \"kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6\") pod \"d6fea67e-88da-4de6-95c7-68b32d1016aa\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") "
Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.286359 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities\") pod \"d6fea67e-88da-4de6-95c7-68b32d1016aa\" (UID: \"d6fea67e-88da-4de6-95c7-68b32d1016aa\") "
Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.289578 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities" (OuterVolumeSpecName: "utilities") pod "d6fea67e-88da-4de6-95c7-68b32d1016aa" (UID:
"d6fea67e-88da-4de6-95c7-68b32d1016aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.299983 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6" (OuterVolumeSpecName: "kube-api-access-ngvw6") pod "d6fea67e-88da-4de6-95c7-68b32d1016aa" (UID: "d6fea67e-88da-4de6-95c7-68b32d1016aa"). InnerVolumeSpecName "kube-api-access-ngvw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.388827 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvw6\" (UniqueName: \"kubernetes.io/projected/d6fea67e-88da-4de6-95c7-68b32d1016aa-kube-api-access-ngvw6\") on node \"crc\" DevicePath \"\"" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.388881 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.454006 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6fea67e-88da-4de6-95c7-68b32d1016aa" (UID: "d6fea67e-88da-4de6-95c7-68b32d1016aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.490179 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fea67e-88da-4de6-95c7-68b32d1016aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.706182 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerID="c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab" exitCode=0 Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.706243 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerDied","Data":"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab"} Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.706312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v47sx" event={"ID":"d6fea67e-88da-4de6-95c7-68b32d1016aa","Type":"ContainerDied","Data":"a9eddd176b99ae0d41ec3529785bbba431bafa899a1e12f506dfc959320bcdd0"} Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.706341 4830 scope.go:117] "RemoveContainer" containerID="c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.706256 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v47sx" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.744893 4830 scope.go:117] "RemoveContainer" containerID="3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.757333 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"] Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.768912 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v47sx"] Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.785987 4830 scope.go:117] "RemoveContainer" containerID="8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.819627 4830 scope.go:117] "RemoveContainer" containerID="c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab" Mar 18 18:54:49 crc kubenswrapper[4830]: E0318 18:54:49.820370 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab\": container with ID starting with c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab not found: ID does not exist" containerID="c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.820438 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab"} err="failed to get container status \"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab\": rpc error: code = NotFound desc = could not find container \"c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab\": container with ID starting with c8b51ce3ede11d56998c9da6911c5e91cd265dcb9746d50a72b34e554cffa1ab not found: ID does 
not exist" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.820476 4830 scope.go:117] "RemoveContainer" containerID="3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50" Mar 18 18:54:49 crc kubenswrapper[4830]: E0318 18:54:49.820859 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50\": container with ID starting with 3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50 not found: ID does not exist" containerID="3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.820922 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50"} err="failed to get container status \"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50\": rpc error: code = NotFound desc = could not find container \"3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50\": container with ID starting with 3b963c5e8f983e2b8c758a06cc575eb3665d32e014899aec1a58287bbfc64c50 not found: ID does not exist" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.820967 4830 scope.go:117] "RemoveContainer" containerID="8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821" Mar 18 18:54:49 crc kubenswrapper[4830]: E0318 18:54:49.821444 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821\": container with ID starting with 8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821 not found: ID does not exist" containerID="8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821" Mar 18 18:54:49 crc kubenswrapper[4830]: I0318 18:54:49.821492 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821"} err="failed to get container status \"8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821\": rpc error: code = NotFound desc = could not find container \"8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821\": container with ID starting with 8218a317bbb194fe3c7f2d5e222129c54fb1610fc842b9e3a10ed6333bdce821 not found: ID does not exist" Mar 18 18:54:50 crc kubenswrapper[4830]: I0318 18:54:50.250348 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" path="/var/lib/kubelet/pods/d6fea67e-88da-4de6-95c7-68b32d1016aa/volumes" Mar 18 18:54:55 crc kubenswrapper[4830]: I0318 18:54:55.235206 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:54:55 crc kubenswrapper[4830]: E0318 18:54:55.235943 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:55:09 crc kubenswrapper[4830]: I0318 18:55:09.235683 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:55:09 crc kubenswrapper[4830]: E0318 18:55:09.237040 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:55:23 crc kubenswrapper[4830]: I0318 18:55:23.234852 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:55:23 crc kubenswrapper[4830]: E0318 18:55:23.235843 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:55:37 crc kubenswrapper[4830]: I0318 18:55:37.235801 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:55:37 crc kubenswrapper[4830]: E0318 18:55:37.236941 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:55:49 crc kubenswrapper[4830]: I0318 18:55:49.235120 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:55:49 crc kubenswrapper[4830]: E0318 18:55:49.238057 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.203191 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564336-z7b68"] Mar 18 18:56:00 crc kubenswrapper[4830]: E0318 18:56:00.204189 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204204 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: E0318 18:56:00.204215 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="extract-content" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204222 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="extract-content" Mar 18 18:56:00 crc kubenswrapper[4830]: E0318 18:56:00.204243 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204251 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: E0318 18:56:00.204261 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="extract-utilities" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204269 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="extract-utilities" Mar 18 18:56:00 
crc kubenswrapper[4830]: E0318 18:56:00.204288 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="extract-content" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204295 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="extract-content" Mar 18 18:56:00 crc kubenswrapper[4830]: E0318 18:56:00.204306 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="extract-utilities" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204313 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="extract-utilities" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204465 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fea67e-88da-4de6-95c7-68b32d1016aa" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.204489 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bc522d-051d-45fa-b4cd-fc7f12a796ae" containerName="registry-server" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.205015 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.208347 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.208449 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.212852 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.225126 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-z7b68"] Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.263199 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff2lr\" (UniqueName: \"kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr\") pod \"auto-csr-approver-29564336-z7b68\" (UID: \"39085195-bae6-4f11-8a16-5e96419af3ac\") " pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.364570 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff2lr\" (UniqueName: \"kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr\") pod \"auto-csr-approver-29564336-z7b68\" (UID: \"39085195-bae6-4f11-8a16-5e96419af3ac\") " pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.390581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff2lr\" (UniqueName: \"kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr\") pod \"auto-csr-approver-29564336-z7b68\" (UID: \"39085195-bae6-4f11-8a16-5e96419af3ac\") " 
pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:00 crc kubenswrapper[4830]: I0318 18:56:00.564824 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:01 crc kubenswrapper[4830]: I0318 18:56:01.016382 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-z7b68"] Mar 18 18:56:01 crc kubenswrapper[4830]: I0318 18:56:01.022870 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:56:01 crc kubenswrapper[4830]: I0318 18:56:01.358582 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-z7b68" event={"ID":"39085195-bae6-4f11-8a16-5e96419af3ac","Type":"ContainerStarted","Data":"313ba50fb046fcbe331c43c9f098dd7c683eebb6331acd747243815690fc9e1d"} Mar 18 18:56:03 crc kubenswrapper[4830]: I0318 18:56:03.383221 4830 generic.go:334] "Generic (PLEG): container finished" podID="39085195-bae6-4f11-8a16-5e96419af3ac" containerID="1f525529fa7310d844b326bd46a6be05b38d9908fde378270e8cc542cbffeb1c" exitCode=0 Mar 18 18:56:03 crc kubenswrapper[4830]: I0318 18:56:03.383298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-z7b68" event={"ID":"39085195-bae6-4f11-8a16-5e96419af3ac","Type":"ContainerDied","Data":"1f525529fa7310d844b326bd46a6be05b38d9908fde378270e8cc542cbffeb1c"} Mar 18 18:56:04 crc kubenswrapper[4830]: I0318 18:56:04.242456 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:56:04 crc kubenswrapper[4830]: E0318 18:56:04.243000 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:56:04 crc kubenswrapper[4830]: I0318 18:56:04.772609 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:04 crc kubenswrapper[4830]: I0318 18:56:04.852229 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff2lr\" (UniqueName: \"kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr\") pod \"39085195-bae6-4f11-8a16-5e96419af3ac\" (UID: \"39085195-bae6-4f11-8a16-5e96419af3ac\") " Mar 18 18:56:04 crc kubenswrapper[4830]: I0318 18:56:04.860943 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr" (OuterVolumeSpecName: "kube-api-access-ff2lr") pod "39085195-bae6-4f11-8a16-5e96419af3ac" (UID: "39085195-bae6-4f11-8a16-5e96419af3ac"). InnerVolumeSpecName "kube-api-access-ff2lr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:56:04 crc kubenswrapper[4830]: I0318 18:56:04.955004 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff2lr\" (UniqueName: \"kubernetes.io/projected/39085195-bae6-4f11-8a16-5e96419af3ac-kube-api-access-ff2lr\") on node \"crc\" DevicePath \"\"" Mar 18 18:56:05 crc kubenswrapper[4830]: I0318 18:56:05.412565 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-z7b68" event={"ID":"39085195-bae6-4f11-8a16-5e96419af3ac","Type":"ContainerDied","Data":"313ba50fb046fcbe331c43c9f098dd7c683eebb6331acd747243815690fc9e1d"} Mar 18 18:56:05 crc kubenswrapper[4830]: I0318 18:56:05.412624 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313ba50fb046fcbe331c43c9f098dd7c683eebb6331acd747243815690fc9e1d" Mar 18 18:56:05 crc kubenswrapper[4830]: I0318 18:56:05.412627 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-z7b68" Mar 18 18:56:05 crc kubenswrapper[4830]: I0318 18:56:05.870045 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-4rgf2"] Mar 18 18:56:05 crc kubenswrapper[4830]: I0318 18:56:05.881166 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-4rgf2"] Mar 18 18:56:06 crc kubenswrapper[4830]: I0318 18:56:06.249909 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5226b313-e059-4748-8821-6d8875e4b11a" path="/var/lib/kubelet/pods/5226b313-e059-4748-8821-6d8875e4b11a/volumes" Mar 18 18:56:16 crc kubenswrapper[4830]: I0318 18:56:16.242328 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:56:16 crc kubenswrapper[4830]: E0318 18:56:16.243262 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:56:17 crc kubenswrapper[4830]: I0318 18:56:17.947032 4830 scope.go:117] "RemoveContainer" containerID="fdc2a7f8c694c80b2b77e6a271954a157eab03b828db5e951263b7de51ba1d5a" Mar 18 18:56:30 crc kubenswrapper[4830]: I0318 18:56:30.235532 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:56:30 crc kubenswrapper[4830]: E0318 18:56:30.236723 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:56:41 crc kubenswrapper[4830]: I0318 18:56:41.235332 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:56:41 crc kubenswrapper[4830]: E0318 18:56:41.236385 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:56:53 crc kubenswrapper[4830]: I0318 18:56:53.235244 4830 scope.go:117] "RemoveContainer" 
containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:56:53 crc kubenswrapper[4830]: E0318 18:56:53.237510 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:57:06 crc kubenswrapper[4830]: I0318 18:57:06.239276 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:57:06 crc kubenswrapper[4830]: E0318 18:57:06.240066 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:57:19 crc kubenswrapper[4830]: I0318 18:57:19.235007 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:57:19 crc kubenswrapper[4830]: E0318 18:57:19.236030 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 18:57:34 crc kubenswrapper[4830]: I0318 18:57:34.234369 4830 scope.go:117] 
"RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365" Mar 18 18:57:35 crc kubenswrapper[4830]: I0318 18:57:35.256794 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811"} Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.158566 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564338-tfzcn"] Mar 18 18:58:00 crc kubenswrapper[4830]: E0318 18:58:00.159458 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39085195-bae6-4f11-8a16-5e96419af3ac" containerName="oc" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.159473 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="39085195-bae6-4f11-8a16-5e96419af3ac" containerName="oc" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.159653 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="39085195-bae6-4f11-8a16-5e96419af3ac" containerName="oc" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.160202 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.163733 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.164254 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.164260 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.181661 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-tfzcn"] Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.266385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gtt\" (UniqueName: \"kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt\") pod \"auto-csr-approver-29564338-tfzcn\" (UID: \"f8276d1d-11ff-4811-8cb6-7cec792c49f9\") " pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.367709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gtt\" (UniqueName: \"kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt\") pod \"auto-csr-approver-29564338-tfzcn\" (UID: \"f8276d1d-11ff-4811-8cb6-7cec792c49f9\") " pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.387340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gtt\" (UniqueName: \"kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt\") pod \"auto-csr-approver-29564338-tfzcn\" (UID: \"f8276d1d-11ff-4811-8cb6-7cec792c49f9\") " 
pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.483052 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:00 crc kubenswrapper[4830]: I0318 18:58:00.956735 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-tfzcn"] Mar 18 18:58:00 crc kubenswrapper[4830]: W0318 18:58:00.965715 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8276d1d_11ff_4811_8cb6_7cec792c49f9.slice/crio-6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9 WatchSource:0}: Error finding container 6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9: Status 404 returned error can't find the container with id 6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9 Mar 18 18:58:01 crc kubenswrapper[4830]: I0318 18:58:01.498759 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" event={"ID":"f8276d1d-11ff-4811-8cb6-7cec792c49f9","Type":"ContainerStarted","Data":"6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9"} Mar 18 18:58:03 crc kubenswrapper[4830]: I0318 18:58:03.518069 4830 generic.go:334] "Generic (PLEG): container finished" podID="f8276d1d-11ff-4811-8cb6-7cec792c49f9" containerID="4cb2c821e989437d9b62683190193bb9224cbe79a296cb322cd554dfd5389d56" exitCode=0 Mar 18 18:58:03 crc kubenswrapper[4830]: I0318 18:58:03.518163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" event={"ID":"f8276d1d-11ff-4811-8cb6-7cec792c49f9","Type":"ContainerDied","Data":"4cb2c821e989437d9b62683190193bb9224cbe79a296cb322cd554dfd5389d56"} Mar 18 18:58:04 crc kubenswrapper[4830]: I0318 18:58:04.958012 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.040466 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gtt\" (UniqueName: \"kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt\") pod \"f8276d1d-11ff-4811-8cb6-7cec792c49f9\" (UID: \"f8276d1d-11ff-4811-8cb6-7cec792c49f9\") " Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.048883 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt" (OuterVolumeSpecName: "kube-api-access-86gtt") pod "f8276d1d-11ff-4811-8cb6-7cec792c49f9" (UID: "f8276d1d-11ff-4811-8cb6-7cec792c49f9"). InnerVolumeSpecName "kube-api-access-86gtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.142296 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86gtt\" (UniqueName: \"kubernetes.io/projected/f8276d1d-11ff-4811-8cb6-7cec792c49f9-kube-api-access-86gtt\") on node \"crc\" DevicePath \"\"" Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.540747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" event={"ID":"f8276d1d-11ff-4811-8cb6-7cec792c49f9","Type":"ContainerDied","Data":"6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9"} Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.540862 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1f1ff076664ee24bab0eb77cf9c11fc050fad56c59ec82af0a537b02faf8e9" Mar 18 18:58:05 crc kubenswrapper[4830]: I0318 18:58:05.540862 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-tfzcn" Mar 18 18:58:06 crc kubenswrapper[4830]: I0318 18:58:06.056600 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-9gdch"] Mar 18 18:58:06 crc kubenswrapper[4830]: I0318 18:58:06.067465 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-9gdch"] Mar 18 18:58:06 crc kubenswrapper[4830]: I0318 18:58:06.282450 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f5fc92-b479-4cf5-9fb9-94f54e3d1822" path="/var/lib/kubelet/pods/87f5fc92-b479-4cf5-9fb9-94f54e3d1822/volumes" Mar 18 18:58:18 crc kubenswrapper[4830]: I0318 18:58:18.033385 4830 scope.go:117] "RemoveContainer" containerID="817e167b742a955d33429b2e3309b89787dc927e322abc34fd86838fb665b397" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.635782 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:58:55 crc kubenswrapper[4830]: E0318 18:58:55.639514 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8276d1d-11ff-4811-8cb6-7cec792c49f9" containerName="oc" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.639755 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8276d1d-11ff-4811-8cb6-7cec792c49f9" containerName="oc" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.640648 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8276d1d-11ff-4811-8cb6-7cec792c49f9" containerName="oc" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.643017 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.656117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.831315 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.831643 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm298\" (UniqueName: \"kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.831698 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.933035 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.933144 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hm298\" (UniqueName: \"kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.933254 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.933828 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.933828 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.962508 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm298\" (UniqueName: \"kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298\") pod \"certified-operators-5wm25\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:55 crc kubenswrapper[4830]: I0318 18:58:55.977825 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:58:56 crc kubenswrapper[4830]: I0318 18:58:56.475522 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:58:57 crc kubenswrapper[4830]: I0318 18:58:57.083121 4830 generic.go:334] "Generic (PLEG): container finished" podID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerID="db66473436d078e77fdb571fd2a8cf4558613b1eb787808bf7f4e04235eeeff4" exitCode=0 Mar 18 18:58:57 crc kubenswrapper[4830]: I0318 18:58:57.083210 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerDied","Data":"db66473436d078e77fdb571fd2a8cf4558613b1eb787808bf7f4e04235eeeff4"} Mar 18 18:58:57 crc kubenswrapper[4830]: I0318 18:58:57.083289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerStarted","Data":"246ab175b990ae046ee5d87593579989459bfa81570f75ec83a066e9aa44a702"} Mar 18 18:58:58 crc kubenswrapper[4830]: I0318 18:58:58.094434 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerStarted","Data":"3ddaca818293fd824eb2cae3dd889ebcb891d373e8ab0aeb4e33fefe3eb11046"} Mar 18 18:58:59 crc kubenswrapper[4830]: I0318 18:58:59.107695 4830 generic.go:334] "Generic (PLEG): container finished" podID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerID="3ddaca818293fd824eb2cae3dd889ebcb891d373e8ab0aeb4e33fefe3eb11046" exitCode=0 Mar 18 18:58:59 crc kubenswrapper[4830]: I0318 18:58:59.107899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" 
event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerDied","Data":"3ddaca818293fd824eb2cae3dd889ebcb891d373e8ab0aeb4e33fefe3eb11046"} Mar 18 18:59:00 crc kubenswrapper[4830]: I0318 18:59:00.135335 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerStarted","Data":"ee0899a8432d9fa609d7b726cc5950a71a0992b1de89951259cb8ada6452f89c"} Mar 18 18:59:00 crc kubenswrapper[4830]: I0318 18:59:00.166211 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5wm25" podStartSLOduration=2.712760958 podStartE2EDuration="5.166191569s" podCreationTimestamp="2026-03-18 18:58:55 +0000 UTC" firstStartedPulling="2026-03-18 18:58:57.085879143 +0000 UTC m=+3371.653509515" lastFinishedPulling="2026-03-18 18:58:59.539309764 +0000 UTC m=+3374.106940126" observedRunningTime="2026-03-18 18:59:00.16058537 +0000 UTC m=+3374.728215742" watchObservedRunningTime="2026-03-18 18:59:00.166191569 +0000 UTC m=+3374.733821911" Mar 18 18:59:05 crc kubenswrapper[4830]: I0318 18:59:05.978874 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:05 crc kubenswrapper[4830]: I0318 18:59:05.979487 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:06 crc kubenswrapper[4830]: I0318 18:59:06.039896 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:06 crc kubenswrapper[4830]: I0318 18:59:06.232688 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:06 crc kubenswrapper[4830]: I0318 18:59:06.291353 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:59:08 crc kubenswrapper[4830]: I0318 18:59:08.207331 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5wm25" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="registry-server" containerID="cri-o://ee0899a8432d9fa609d7b726cc5950a71a0992b1de89951259cb8ada6452f89c" gracePeriod=2 Mar 18 18:59:09 crc kubenswrapper[4830]: I0318 18:59:09.223975 4830 generic.go:334] "Generic (PLEG): container finished" podID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerID="ee0899a8432d9fa609d7b726cc5950a71a0992b1de89951259cb8ada6452f89c" exitCode=0 Mar 18 18:59:09 crc kubenswrapper[4830]: I0318 18:59:09.224183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerDied","Data":"ee0899a8432d9fa609d7b726cc5950a71a0992b1de89951259cb8ada6452f89c"} Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.392470 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.581312 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities\") pod \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.581475 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm298\" (UniqueName: \"kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298\") pod \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.581583 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content\") pod \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\" (UID: \"dab79dc5-9beb-45bf-85fe-a9ba5f411241\") " Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.583644 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities" (OuterVolumeSpecName: "utilities") pod "dab79dc5-9beb-45bf-85fe-a9ba5f411241" (UID: "dab79dc5-9beb-45bf-85fe-a9ba5f411241"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.590029 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298" (OuterVolumeSpecName: "kube-api-access-hm298") pod "dab79dc5-9beb-45bf-85fe-a9ba5f411241" (UID: "dab79dc5-9beb-45bf-85fe-a9ba5f411241"). InnerVolumeSpecName "kube-api-access-hm298". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.630599 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dab79dc5-9beb-45bf-85fe-a9ba5f411241" (UID: "dab79dc5-9beb-45bf-85fe-a9ba5f411241"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.683608 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.683659 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm298\" (UniqueName: \"kubernetes.io/projected/dab79dc5-9beb-45bf-85fe-a9ba5f411241-kube-api-access-hm298\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:10 crc kubenswrapper[4830]: I0318 18:59:10.683683 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab79dc5-9beb-45bf-85fe-a9ba5f411241-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.250637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wm25" event={"ID":"dab79dc5-9beb-45bf-85fe-a9ba5f411241","Type":"ContainerDied","Data":"246ab175b990ae046ee5d87593579989459bfa81570f75ec83a066e9aa44a702"} Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.250730 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wm25" Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.251127 4830 scope.go:117] "RemoveContainer" containerID="ee0899a8432d9fa609d7b726cc5950a71a0992b1de89951259cb8ada6452f89c" Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.294382 4830 scope.go:117] "RemoveContainer" containerID="3ddaca818293fd824eb2cae3dd889ebcb891d373e8ab0aeb4e33fefe3eb11046" Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.302562 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.317276 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5wm25"] Mar 18 18:59:11 crc kubenswrapper[4830]: I0318 18:59:11.338309 4830 scope.go:117] "RemoveContainer" containerID="db66473436d078e77fdb571fd2a8cf4558613b1eb787808bf7f4e04235eeeff4" Mar 18 18:59:12 crc kubenswrapper[4830]: I0318 18:59:12.250459 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" path="/var/lib/kubelet/pods/dab79dc5-9beb-45bf-85fe-a9ba5f411241/volumes" Mar 18 18:59:59 crc kubenswrapper[4830]: I0318 18:59:59.509540 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:59:59 crc kubenswrapper[4830]: I0318 18:59:59.510318 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:00:00 crc kubenswrapper[4830]: 
I0318 19:00:00.138192 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564340-jdt29"] Mar 18 19:00:00 crc kubenswrapper[4830]: E0318 19:00:00.138568 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="extract-content" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.138593 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="extract-content" Mar 18 19:00:00 crc kubenswrapper[4830]: E0318 19:00:00.138648 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.138659 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[4830]: E0318 19:00:00.138675 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="extract-utilities" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.138684 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="extract-utilities" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.138881 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab79dc5-9beb-45bf-85fe-a9ba5f411241" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.139428 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-jdt29" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.142846 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.143081 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.143988 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.146108 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"] Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.147181 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.149075 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.149218 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.163365 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-jdt29"] Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.170679 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"] Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.193854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9mm\" (UniqueName: 
\"kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm\") pod \"auto-csr-approver-29564340-jdt29\" (UID: \"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec\") " pod="openshift-infra/auto-csr-approver-29564340-jdt29" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.294980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.295073 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.295235 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9mm\" (UniqueName: \"kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm\") pod \"auto-csr-approver-29564340-jdt29\" (UID: \"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec\") " pod="openshift-infra/auto-csr-approver-29564340-jdt29" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.295361 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tvx\" (UniqueName: \"kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 
crc kubenswrapper[4830]: I0318 19:00:00.326160 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9mm\" (UniqueName: \"kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm\") pod \"auto-csr-approver-29564340-jdt29\" (UID: \"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec\") " pod="openshift-infra/auto-csr-approver-29564340-jdt29" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.397123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tvx\" (UniqueName: \"kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.397246 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.397300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.399383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: 
\"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.403818 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.425024 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tvx\" (UniqueName: \"kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx\") pod \"collect-profiles-29564340-5vwhs\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.477135 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-jdt29" Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.495583 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"
Mar 18 19:00:00 crc kubenswrapper[4830]: I0318 19:00:00.858883 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-jdt29"]
Mar 18 19:00:01 crc kubenswrapper[4830]: W0318 19:00:01.014616 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d453f8_0d7d_44ca_b63f_88e0da076fea.slice/crio-7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155 WatchSource:0}: Error finding container 7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155: Status 404 returned error can't find the container with id 7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155
Mar 18 19:00:01 crc kubenswrapper[4830]: I0318 19:00:01.015524 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"]
Mar 18 19:00:01 crc kubenswrapper[4830]: I0318 19:00:01.705519 4830 generic.go:334] "Generic (PLEG): container finished" podID="70d453f8-0d7d-44ca-b63f-88e0da076fea" containerID="93f072bba1ace13077f8734bf0b5a000f728c7b2f1186321b689085cb9d513ed" exitCode=0
Mar 18 19:00:01 crc kubenswrapper[4830]: I0318 19:00:01.705814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" event={"ID":"70d453f8-0d7d-44ca-b63f-88e0da076fea","Type":"ContainerDied","Data":"93f072bba1ace13077f8734bf0b5a000f728c7b2f1186321b689085cb9d513ed"}
Mar 18 19:00:01 crc kubenswrapper[4830]: I0318 19:00:01.706451 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" event={"ID":"70d453f8-0d7d-44ca-b63f-88e0da076fea","Type":"ContainerStarted","Data":"7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155"}
Mar 18 19:00:01 crc kubenswrapper[4830]: I0318 19:00:01.708038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-jdt29" event={"ID":"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec","Type":"ContainerStarted","Data":"5cde1588a6c833e773599743883b363fb9e713f43945c61c649cfd34e2ec527c"}
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.048237 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.145793 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume\") pod \"70d453f8-0d7d-44ca-b63f-88e0da076fea\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") "
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.145845 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9tvx\" (UniqueName: \"kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx\") pod \"70d453f8-0d7d-44ca-b63f-88e0da076fea\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") "
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.145872 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume\") pod \"70d453f8-0d7d-44ca-b63f-88e0da076fea\" (UID: \"70d453f8-0d7d-44ca-b63f-88e0da076fea\") "
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.146448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume" (OuterVolumeSpecName: "config-volume") pod "70d453f8-0d7d-44ca-b63f-88e0da076fea" (UID: "70d453f8-0d7d-44ca-b63f-88e0da076fea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.153631 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx" (OuterVolumeSpecName: "kube-api-access-q9tvx") pod "70d453f8-0d7d-44ca-b63f-88e0da076fea" (UID: "70d453f8-0d7d-44ca-b63f-88e0da076fea"). InnerVolumeSpecName "kube-api-access-q9tvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.153911 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70d453f8-0d7d-44ca-b63f-88e0da076fea" (UID: "70d453f8-0d7d-44ca-b63f-88e0da076fea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.247458 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70d453f8-0d7d-44ca-b63f-88e0da076fea-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.247520 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9tvx\" (UniqueName: \"kubernetes.io/projected/70d453f8-0d7d-44ca-b63f-88e0da076fea-kube-api-access-q9tvx\") on node \"crc\" DevicePath \"\""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.247538 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70d453f8-0d7d-44ca-b63f-88e0da076fea-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.725674 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs" event={"ID":"70d453f8-0d7d-44ca-b63f-88e0da076fea","Type":"ContainerDied","Data":"7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155"}
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.726172 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd0a130a8eac817d2060297defa6ab7892457d1154c52d24c9a30019e67c155"
Mar 18 19:00:03 crc kubenswrapper[4830]: I0318 19:00:03.725755 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-5vwhs"
Mar 18 19:00:04 crc kubenswrapper[4830]: I0318 19:00:04.130913 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v"]
Mar 18 19:00:04 crc kubenswrapper[4830]: I0318 19:00:04.135721 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-g725v"]
Mar 18 19:00:04 crc kubenswrapper[4830]: I0318 19:00:04.244964 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb11874-1ec9-48f8-9312-718370bab9d1" path="/var/lib/kubelet/pods/cfb11874-1ec9-48f8-9312-718370bab9d1/volumes"
Mar 18 19:00:04 crc kubenswrapper[4830]: I0318 19:00:04.741695 4830 generic.go:334] "Generic (PLEG): container finished" podID="246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" containerID="cb2d005f71cced88f397c9e6a53fab664f89b86834823e262879e348f25cfe28" exitCode=0
Mar 18 19:00:04 crc kubenswrapper[4830]: I0318 19:00:04.741897 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-jdt29" event={"ID":"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec","Type":"ContainerDied","Data":"cb2d005f71cced88f397c9e6a53fab664f89b86834823e262879e348f25cfe28"}
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.044959 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-jdt29"
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.195042 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9mm\" (UniqueName: \"kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm\") pod \"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec\" (UID: \"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec\") "
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.200416 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm" (OuterVolumeSpecName: "kube-api-access-7t9mm") pod "246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" (UID: "246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec"). InnerVolumeSpecName "kube-api-access-7t9mm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.296445 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9mm\" (UniqueName: \"kubernetes.io/projected/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec-kube-api-access-7t9mm\") on node \"crc\" DevicePath \"\""
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.759489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-jdt29" event={"ID":"246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec","Type":"ContainerDied","Data":"5cde1588a6c833e773599743883b363fb9e713f43945c61c649cfd34e2ec527c"}
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.759528 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cde1588a6c833e773599743883b363fb9e713f43945c61c649cfd34e2ec527c"
Mar 18 19:00:06 crc kubenswrapper[4830]: I0318 19:00:06.759584 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-jdt29"
Mar 18 19:00:07 crc kubenswrapper[4830]: I0318 19:00:07.114889 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-s8dn2"]
Mar 18 19:00:07 crc kubenswrapper[4830]: I0318 19:00:07.124463 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-s8dn2"]
Mar 18 19:00:08 crc kubenswrapper[4830]: I0318 19:00:08.250730 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb4e508-7cc8-431d-9978-0bef156b5adb" path="/var/lib/kubelet/pods/cdb4e508-7cc8-431d-9978-0bef156b5adb/volumes"
Mar 18 19:00:18 crc kubenswrapper[4830]: I0318 19:00:18.151754 4830 scope.go:117] "RemoveContainer" containerID="026cd292b47a740a4f80947a447a9aae6282e861d3336af0e7a1d9ce29072b04"
Mar 18 19:00:18 crc kubenswrapper[4830]: I0318 19:00:18.200182 4830 scope.go:117] "RemoveContainer" containerID="b9dc640e172bbbdfdd8d24c86f2f50f970067cff0e2767fa98a7ab840fefea77"
Mar 18 19:00:29 crc kubenswrapper[4830]: I0318 19:00:29.509692 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:00:29 crc kubenswrapper[4830]: I0318 19:00:29.510538 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.036737 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:00:51 crc kubenswrapper[4830]: E0318 19:00:51.038167 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d453f8-0d7d-44ca-b63f-88e0da076fea" containerName="collect-profiles"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.038201 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d453f8-0d7d-44ca-b63f-88e0da076fea" containerName="collect-profiles"
Mar 18 19:00:51 crc kubenswrapper[4830]: E0318 19:00:51.038280 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" containerName="oc"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.038299 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" containerName="oc"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.038626 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" containerName="oc"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.038661 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d453f8-0d7d-44ca-b63f-88e0da076fea" containerName="collect-profiles"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.041353 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.063667 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.146497 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td7nj\" (UniqueName: \"kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.146664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.146702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.248217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.248306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.248372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td7nj\" (UniqueName: \"kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.248971 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.249057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.276967 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td7nj\" (UniqueName: \"kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj\") pod \"community-operators-mzs5x\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") " pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.362802 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:00:51 crc kubenswrapper[4830]: I0318 19:00:51.708184 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:00:52 crc kubenswrapper[4830]: I0318 19:00:52.207503 4830 generic.go:334] "Generic (PLEG): container finished" podID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerID="5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000" exitCode=0
Mar 18 19:00:52 crc kubenswrapper[4830]: I0318 19:00:52.207585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerDied","Data":"5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000"}
Mar 18 19:00:52 crc kubenswrapper[4830]: I0318 19:00:52.207871 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerStarted","Data":"c3e61dcfd26d1a963dd3aae0375bb7669d06678961066ea53e5c9b84411094ee"}
Mar 18 19:00:54 crc kubenswrapper[4830]: I0318 19:00:54.230104 4830 generic.go:334] "Generic (PLEG): container finished" podID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerID="3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08" exitCode=0
Mar 18 19:00:54 crc kubenswrapper[4830]: I0318 19:00:54.230332 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerDied","Data":"3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08"}
Mar 18 19:00:56 crc kubenswrapper[4830]: I0318 19:00:56.248614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerStarted","Data":"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"}
Mar 18 19:00:56 crc kubenswrapper[4830]: I0318 19:00:56.266643 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mzs5x" podStartSLOduration=1.607183716 podStartE2EDuration="5.266617373s" podCreationTimestamp="2026-03-18 19:00:51 +0000 UTC" firstStartedPulling="2026-03-18 19:00:52.209848178 +0000 UTC m=+3486.777478550" lastFinishedPulling="2026-03-18 19:00:55.869281835 +0000 UTC m=+3490.436912207" observedRunningTime="2026-03-18 19:00:56.265204523 +0000 UTC m=+3490.832834865" watchObservedRunningTime="2026-03-18 19:00:56.266617373 +0000 UTC m=+3490.834247705"
Mar 18 19:00:59 crc kubenswrapper[4830]: I0318 19:00:59.509095 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:00:59 crc kubenswrapper[4830]: I0318 19:00:59.509671 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:00:59 crc kubenswrapper[4830]: I0318 19:00:59.509718 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 19:00:59 crc kubenswrapper[4830]: I0318 19:00:59.510385 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 19:00:59 crc kubenswrapper[4830]: I0318 19:00:59.510459 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811" gracePeriod=600
Mar 18 19:01:00 crc kubenswrapper[4830]: I0318 19:01:00.284583 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811" exitCode=0
Mar 18 19:01:00 crc kubenswrapper[4830]: I0318 19:01:00.284639 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811"}
Mar 18 19:01:00 crc kubenswrapper[4830]: I0318 19:01:00.285143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8"}
Mar 18 19:01:00 crc kubenswrapper[4830]: I0318 19:01:00.285197 4830 scope.go:117] "RemoveContainer" containerID="9ebf0be27630d291f563664287588810c1f00205f7e12e3c7064ff19e68f2365"
Mar 18 19:01:01 crc kubenswrapper[4830]: I0318 19:01:01.363463 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:01 crc kubenswrapper[4830]: I0318 19:01:01.363923 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:01 crc kubenswrapper[4830]: I0318 19:01:01.433614 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:02 crc kubenswrapper[4830]: I0318 19:01:02.377894 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:02 crc kubenswrapper[4830]: I0318 19:01:02.434912 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.318149 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mzs5x" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="registry-server" containerID="cri-o://f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da" gracePeriod=2
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.717499 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.849469 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content\") pod \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") "
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.849516 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities\") pod \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") "
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.849589 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td7nj\" (UniqueName: \"kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj\") pod \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\" (UID: \"d2135d7b-1b92-4b95-ac84-e69c0054dedb\") "
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.851178 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities" (OuterVolumeSpecName: "utilities") pod "d2135d7b-1b92-4b95-ac84-e69c0054dedb" (UID: "d2135d7b-1b92-4b95-ac84-e69c0054dedb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.855034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj" (OuterVolumeSpecName: "kube-api-access-td7nj") pod "d2135d7b-1b92-4b95-ac84-e69c0054dedb" (UID: "d2135d7b-1b92-4b95-ac84-e69c0054dedb"). InnerVolumeSpecName "kube-api-access-td7nj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.924184 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2135d7b-1b92-4b95-ac84-e69c0054dedb" (UID: "d2135d7b-1b92-4b95-ac84-e69c0054dedb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.951107 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td7nj\" (UniqueName: \"kubernetes.io/projected/d2135d7b-1b92-4b95-ac84-e69c0054dedb-kube-api-access-td7nj\") on node \"crc\" DevicePath \"\""
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.951155 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:01:04 crc kubenswrapper[4830]: I0318 19:01:04.951167 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2135d7b-1b92-4b95-ac84-e69c0054dedb-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.328963 4830 generic.go:334] "Generic (PLEG): container finished" podID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerID="f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da" exitCode=0
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.329046 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerDied","Data":"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"}
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.329089 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzs5x"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.329119 4830 scope.go:117] "RemoveContainer" containerID="f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.329098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzs5x" event={"ID":"d2135d7b-1b92-4b95-ac84-e69c0054dedb","Type":"ContainerDied","Data":"c3e61dcfd26d1a963dd3aae0375bb7669d06678961066ea53e5c9b84411094ee"}
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.362573 4830 scope.go:117] "RemoveContainer" containerID="3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.385848 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.395020 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mzs5x"]
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.399652 4830 scope.go:117] "RemoveContainer" containerID="5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.439565 4830 scope.go:117] "RemoveContainer" containerID="f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"
Mar 18 19:01:05 crc kubenswrapper[4830]: E0318 19:01:05.439951 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da\": container with ID starting with f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da not found: ID does not exist" containerID="f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.439995 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da"} err="failed to get container status \"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da\": rpc error: code = NotFound desc = could not find container \"f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da\": container with ID starting with f121525abfaf4e57280983a070bcbc72b740baf3ac592416833bf12ea8b656da not found: ID does not exist"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.440021 4830 scope.go:117] "RemoveContainer" containerID="3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08"
Mar 18 19:01:05 crc kubenswrapper[4830]: E0318 19:01:05.440279 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08\": container with ID starting with 3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08 not found: ID does not exist" containerID="3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.440322 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08"} err="failed to get container status \"3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08\": rpc error: code = NotFound desc = could not find container \"3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08\": container with ID starting with 3a5d61318498dd6606ddacbc398d98ec3f4e200561fda6922bfb8e2268746b08 not found: ID does not exist"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.440343 4830 scope.go:117] "RemoveContainer" containerID="5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000"
Mar 18 19:01:05 crc kubenswrapper[4830]: E0318 19:01:05.440520 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000\": container with ID starting with 5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000 not found: ID does not exist" containerID="5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000"
Mar 18 19:01:05 crc kubenswrapper[4830]: I0318 19:01:05.440550 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000"} err="failed to get container status \"5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000\": rpc error: code = NotFound desc = could not find container \"5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000\": container with ID starting with 5750eea946162874deb58d3a402dce0a5d045ca96c0cb533c94351650854a000 not found: ID does not exist"
Mar 18 19:01:06 crc kubenswrapper[4830]: I0318 19:01:06.254037 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" path="/var/lib/kubelet/pods/d2135d7b-1b92-4b95-ac84-e69c0054dedb/volumes"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.152119 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564342-wctwx"]
Mar 18 19:02:00 crc kubenswrapper[4830]: E0318 19:02:00.152957 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="extract-content"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.152972 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="extract-content"
Mar 18 19:02:00 crc kubenswrapper[4830]: E0318 19:02:00.153001 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="extract-utilities"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.153007 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="extract-utilities"
Mar 18 19:02:00 crc kubenswrapper[4830]: E0318 19:02:00.153023 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="registry-server"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.153029 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="registry-server"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.153165 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2135d7b-1b92-4b95-ac84-e69c0054dedb" containerName="registry-server"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.153632 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.155950 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-wctwx"]
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.156093 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.156371 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.156741 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.240017 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ftm\" (UniqueName: \"kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm\") pod \"auto-csr-approver-29564342-wctwx\" (UID: \"6d0ec0fb-f334-41e8-a5e6-47d858b02b63\") " pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.340896 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ftm\" (UniqueName: \"kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm\") pod \"auto-csr-approver-29564342-wctwx\" (UID: \"6d0ec0fb-f334-41e8-a5e6-47d858b02b63\") " pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.362685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ftm\" (UniqueName: \"kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm\") pod \"auto-csr-approver-29564342-wctwx\" (UID: \"6d0ec0fb-f334-41e8-a5e6-47d858b02b63\") " pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.507158 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.761149 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-wctwx"]
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.770335 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 19:02:00 crc kubenswrapper[4830]: I0318 19:02:00.810805 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-wctwx" event={"ID":"6d0ec0fb-f334-41e8-a5e6-47d858b02b63","Type":"ContainerStarted","Data":"0ba3dd8b49258590a11cff309f41492e862b4d0280ea5757abb3b5c3469c8dd9"}
Mar 18 19:02:02 crc kubenswrapper[4830]: I0318 19:02:02.833487 4830 generic.go:334] "Generic (PLEG): container finished" podID="6d0ec0fb-f334-41e8-a5e6-47d858b02b63" containerID="ccd66cf137b643dbdd204b60b4c5c4fc0dabe230af9303e60222676324d01db4" exitCode=0
Mar 18 19:02:02 crc kubenswrapper[4830]: I0318 19:02:02.833702 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-wctwx" event={"ID":"6d0ec0fb-f334-41e8-a5e6-47d858b02b63","Type":"ContainerDied","Data":"ccd66cf137b643dbdd204b60b4c5c4fc0dabe230af9303e60222676324d01db4"}
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.222277 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-wctwx"
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.418823 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ftm\" (UniqueName: \"kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm\") pod \"6d0ec0fb-f334-41e8-a5e6-47d858b02b63\" (UID: \"6d0ec0fb-f334-41e8-a5e6-47d858b02b63\") "
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.425988 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm" (OuterVolumeSpecName: "kube-api-access-24ftm") pod "6d0ec0fb-f334-41e8-a5e6-47d858b02b63" (UID: "6d0ec0fb-f334-41e8-a5e6-47d858b02b63"). InnerVolumeSpecName "kube-api-access-24ftm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.521104 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ftm\" (UniqueName: \"kubernetes.io/projected/6d0ec0fb-f334-41e8-a5e6-47d858b02b63-kube-api-access-24ftm\") on node \"crc\" DevicePath \"\""
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.855281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-wctwx" event={"ID":"6d0ec0fb-f334-41e8-a5e6-47d858b02b63","Type":"ContainerDied","Data":"0ba3dd8b49258590a11cff309f41492e862b4d0280ea5757abb3b5c3469c8dd9"}
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.855356 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba3dd8b49258590a11cff309f41492e862b4d0280ea5757abb3b5c3469c8dd9"
Mar 18 19:02:04 crc kubenswrapper[4830]: I0318 19:02:04.855849 4830 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-wctwx" Mar 18 19:02:05 crc kubenswrapper[4830]: I0318 19:02:05.322298 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-z7b68"] Mar 18 19:02:05 crc kubenswrapper[4830]: I0318 19:02:05.333352 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-z7b68"] Mar 18 19:02:06 crc kubenswrapper[4830]: I0318 19:02:06.250831 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39085195-bae6-4f11-8a16-5e96419af3ac" path="/var/lib/kubelet/pods/39085195-bae6-4f11-8a16-5e96419af3ac/volumes" Mar 18 19:02:18 crc kubenswrapper[4830]: I0318 19:02:18.346479 4830 scope.go:117] "RemoveContainer" containerID="1f525529fa7310d844b326bd46a6be05b38d9908fde378270e8cc542cbffeb1c" Mar 18 19:02:59 crc kubenswrapper[4830]: I0318 19:02:59.509605 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:02:59 crc kubenswrapper[4830]: I0318 19:02:59.510338 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:03:29 crc kubenswrapper[4830]: I0318 19:03:29.510286 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:03:29 crc kubenswrapper[4830]: 
I0318 19:03:29.511159 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.509485 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.510283 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.510360 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.511352 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.511453 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" 
containerName="machine-config-daemon" containerID="cri-o://c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" gracePeriod=600 Mar 18 19:03:59 crc kubenswrapper[4830]: E0318 19:03:59.645025 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.950937 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" exitCode=0 Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.951146 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8"} Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.951359 4830 scope.go:117] "RemoveContainer" containerID="13b22ec6d37d4fdb9af379b78e2efb5238b0fc5b9b902fa2c6d79182a9053811" Mar 18 19:03:59 crc kubenswrapper[4830]: I0318 19:03:59.952015 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:03:59 crc kubenswrapper[4830]: E0318 19:03:59.952383 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.160479 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564344-nm6mj"] Mar 18 19:04:00 crc kubenswrapper[4830]: E0318 19:04:00.160872 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0ec0fb-f334-41e8-a5e6-47d858b02b63" containerName="oc" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.160891 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0ec0fb-f334-41e8-a5e6-47d858b02b63" containerName="oc" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.161071 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0ec0fb-f334-41e8-a5e6-47d858b02b63" containerName="oc" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.161833 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.164478 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.164856 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.164942 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.180596 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-nm6mj"] Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.250429 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtxz\" (UniqueName: 
\"kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz\") pod \"auto-csr-approver-29564344-nm6mj\" (UID: \"105ad763-b28d-48b2-a76a-dd01caf486b3\") " pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.352526 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtxz\" (UniqueName: \"kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz\") pod \"auto-csr-approver-29564344-nm6mj\" (UID: \"105ad763-b28d-48b2-a76a-dd01caf486b3\") " pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.386208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtxz\" (UniqueName: \"kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz\") pod \"auto-csr-approver-29564344-nm6mj\" (UID: \"105ad763-b28d-48b2-a76a-dd01caf486b3\") " pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.491369 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.794476 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-nm6mj"] Mar 18 19:04:00 crc kubenswrapper[4830]: I0318 19:04:00.963763 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" event={"ID":"105ad763-b28d-48b2-a76a-dd01caf486b3","Type":"ContainerStarted","Data":"d6026138c94f9307f07e289093644bfd2b7a43343fcc1dbedde82ddea420d8a8"} Mar 18 19:04:02 crc kubenswrapper[4830]: I0318 19:04:02.997637 4830 generic.go:334] "Generic (PLEG): container finished" podID="105ad763-b28d-48b2-a76a-dd01caf486b3" containerID="8b1ec9e3a0da55248708acac973d7994798f9b298376f475afb8c930b9fa502e" exitCode=0 Mar 18 19:04:02 crc kubenswrapper[4830]: I0318 19:04:02.997716 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" event={"ID":"105ad763-b28d-48b2-a76a-dd01caf486b3","Type":"ContainerDied","Data":"8b1ec9e3a0da55248708acac973d7994798f9b298376f475afb8c930b9fa502e"} Mar 18 19:04:04 crc kubenswrapper[4830]: I0318 19:04:04.347961 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:04 crc kubenswrapper[4830]: I0318 19:04:04.415062 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbtxz\" (UniqueName: \"kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz\") pod \"105ad763-b28d-48b2-a76a-dd01caf486b3\" (UID: \"105ad763-b28d-48b2-a76a-dd01caf486b3\") " Mar 18 19:04:04 crc kubenswrapper[4830]: I0318 19:04:04.426422 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz" (OuterVolumeSpecName: "kube-api-access-fbtxz") pod "105ad763-b28d-48b2-a76a-dd01caf486b3" (UID: "105ad763-b28d-48b2-a76a-dd01caf486b3"). InnerVolumeSpecName "kube-api-access-fbtxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:04:04 crc kubenswrapper[4830]: I0318 19:04:04.517505 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbtxz\" (UniqueName: \"kubernetes.io/projected/105ad763-b28d-48b2-a76a-dd01caf486b3-kube-api-access-fbtxz\") on node \"crc\" DevicePath \"\"" Mar 18 19:04:05 crc kubenswrapper[4830]: I0318 19:04:05.023268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" event={"ID":"105ad763-b28d-48b2-a76a-dd01caf486b3","Type":"ContainerDied","Data":"d6026138c94f9307f07e289093644bfd2b7a43343fcc1dbedde82ddea420d8a8"} Mar 18 19:04:05 crc kubenswrapper[4830]: I0318 19:04:05.023330 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6026138c94f9307f07e289093644bfd2b7a43343fcc1dbedde82ddea420d8a8" Mar 18 19:04:05 crc kubenswrapper[4830]: I0318 19:04:05.023337 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-nm6mj" Mar 18 19:04:05 crc kubenswrapper[4830]: I0318 19:04:05.429037 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-tfzcn"] Mar 18 19:04:05 crc kubenswrapper[4830]: I0318 19:04:05.433445 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-tfzcn"] Mar 18 19:04:06 crc kubenswrapper[4830]: I0318 19:04:06.250497 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8276d1d-11ff-4811-8cb6-7cec792c49f9" path="/var/lib/kubelet/pods/f8276d1d-11ff-4811-8cb6-7cec792c49f9/volumes" Mar 18 19:04:13 crc kubenswrapper[4830]: I0318 19:04:13.234862 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:04:13 crc kubenswrapper[4830]: E0318 19:04:13.235559 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:04:18 crc kubenswrapper[4830]: I0318 19:04:18.453688 4830 scope.go:117] "RemoveContainer" containerID="4cb2c821e989437d9b62683190193bb9224cbe79a296cb322cd554dfd5389d56" Mar 18 19:04:28 crc kubenswrapper[4830]: I0318 19:04:28.235647 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:04:28 crc kubenswrapper[4830]: E0318 19:04:28.236428 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.690584 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:04:38 crc kubenswrapper[4830]: E0318 19:04:38.691699 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="105ad763-b28d-48b2-a76a-dd01caf486b3" containerName="oc" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.691721 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="105ad763-b28d-48b2-a76a-dd01caf486b3" containerName="oc" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.691985 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="105ad763-b28d-48b2-a76a-dd01caf486b3" containerName="oc" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.698005 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.703656 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.872704 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.872761 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvw7b\" (UniqueName: \"kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.872895 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.973996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.974303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.974324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvw7b\" (UniqueName: \"kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.974454 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.974634 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:38 crc kubenswrapper[4830]: I0318 19:04:38.996161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvw7b\" (UniqueName: \"kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b\") pod \"redhat-operators-6mkvh\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:39 crc kubenswrapper[4830]: I0318 19:04:39.040589 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:39 crc kubenswrapper[4830]: I0318 19:04:39.312567 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:04:39 crc kubenswrapper[4830]: I0318 19:04:39.325739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerStarted","Data":"4b631223261a1d6bb5a4795089bfc0c93db8f2e18b44e51c1973799c6be4c01e"} Mar 18 19:04:40 crc kubenswrapper[4830]: I0318 19:04:40.235474 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:04:40 crc kubenswrapper[4830]: E0318 19:04:40.235884 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:04:40 crc kubenswrapper[4830]: I0318 19:04:40.336604 4830 generic.go:334] "Generic (PLEG): container finished" podID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerID="2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c" exitCode=0 Mar 18 19:04:40 crc kubenswrapper[4830]: I0318 19:04:40.336670 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerDied","Data":"2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c"} Mar 18 19:04:42 crc kubenswrapper[4830]: I0318 19:04:42.356083 4830 generic.go:334] "Generic (PLEG): container finished" podID="93b6c420-8c78-4e77-a50d-62c9527e5907" 
containerID="a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961" exitCode=0 Mar 18 19:04:42 crc kubenswrapper[4830]: I0318 19:04:42.356182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerDied","Data":"a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961"} Mar 18 19:04:43 crc kubenswrapper[4830]: I0318 19:04:43.370195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerStarted","Data":"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e"} Mar 18 19:04:43 crc kubenswrapper[4830]: I0318 19:04:43.399412 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6mkvh" podStartSLOduration=2.8631282970000003 podStartE2EDuration="5.399387223s" podCreationTimestamp="2026-03-18 19:04:38 +0000 UTC" firstStartedPulling="2026-03-18 19:04:40.339150594 +0000 UTC m=+3714.906780966" lastFinishedPulling="2026-03-18 19:04:42.87540953 +0000 UTC m=+3717.443039892" observedRunningTime="2026-03-18 19:04:43.396232134 +0000 UTC m=+3717.963862546" watchObservedRunningTime="2026-03-18 19:04:43.399387223 +0000 UTC m=+3717.967017575" Mar 18 19:04:49 crc kubenswrapper[4830]: I0318 19:04:49.041301 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:49 crc kubenswrapper[4830]: I0318 19:04:49.042126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:50 crc kubenswrapper[4830]: I0318 19:04:50.113596 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6mkvh" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="registry-server" 
probeResult="failure" output=< Mar 18 19:04:50 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 19:04:50 crc kubenswrapper[4830]: > Mar 18 19:04:52 crc kubenswrapper[4830]: I0318 19:04:52.234964 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:04:52 crc kubenswrapper[4830]: E0318 19:04:52.235258 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:04:59 crc kubenswrapper[4830]: I0318 19:04:59.108460 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:59 crc kubenswrapper[4830]: I0318 19:04:59.165664 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:04:59 crc kubenswrapper[4830]: I0318 19:04:59.358047 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:05:00 crc kubenswrapper[4830]: I0318 19:05:00.520450 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6mkvh" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="registry-server" containerID="cri-o://4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e" gracePeriod=2 Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.006356 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.176224 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content\") pod \"93b6c420-8c78-4e77-a50d-62c9527e5907\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.177997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities\") pod \"93b6c420-8c78-4e77-a50d-62c9527e5907\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.178345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvw7b\" (UniqueName: \"kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b\") pod \"93b6c420-8c78-4e77-a50d-62c9527e5907\" (UID: \"93b6c420-8c78-4e77-a50d-62c9527e5907\") " Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.179607 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities" (OuterVolumeSpecName: "utilities") pod "93b6c420-8c78-4e77-a50d-62c9527e5907" (UID: "93b6c420-8c78-4e77-a50d-62c9527e5907"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.187659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b" (OuterVolumeSpecName: "kube-api-access-zvw7b") pod "93b6c420-8c78-4e77-a50d-62c9527e5907" (UID: "93b6c420-8c78-4e77-a50d-62c9527e5907"). InnerVolumeSpecName "kube-api-access-zvw7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.280638 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvw7b\" (UniqueName: \"kubernetes.io/projected/93b6c420-8c78-4e77-a50d-62c9527e5907-kube-api-access-zvw7b\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.280690 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.355985 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93b6c420-8c78-4e77-a50d-62c9527e5907" (UID: "93b6c420-8c78-4e77-a50d-62c9527e5907"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.385145 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6c420-8c78-4e77-a50d-62c9527e5907-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.530858 4830 generic.go:334] "Generic (PLEG): container finished" podID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerID="4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e" exitCode=0 Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.530953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerDied","Data":"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e"} Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.530962 4830 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6mkvh" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.531009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6mkvh" event={"ID":"93b6c420-8c78-4e77-a50d-62c9527e5907","Type":"ContainerDied","Data":"4b631223261a1d6bb5a4795089bfc0c93db8f2e18b44e51c1973799c6be4c01e"} Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.531052 4830 scope.go:117] "RemoveContainer" containerID="4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.559166 4830 scope.go:117] "RemoveContainer" containerID="a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.587595 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.600918 4830 scope.go:117] "RemoveContainer" containerID="2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.604810 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6mkvh"] Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.638941 4830 scope.go:117] "RemoveContainer" containerID="4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e" Mar 18 19:05:01 crc kubenswrapper[4830]: E0318 19:05:01.639570 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e\": container with ID starting with 4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e not found: ID does not exist" containerID="4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.639643 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e"} err="failed to get container status \"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e\": rpc error: code = NotFound desc = could not find container \"4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e\": container with ID starting with 4e4c83fcfee062a8c8bca122f225126517479822e38928f47aa94af2e8681b3e not found: ID does not exist" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.639710 4830 scope.go:117] "RemoveContainer" containerID="a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961" Mar 18 19:05:01 crc kubenswrapper[4830]: E0318 19:05:01.640508 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961\": container with ID starting with a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961 not found: ID does not exist" containerID="a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.640569 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961"} err="failed to get container status \"a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961\": rpc error: code = NotFound desc = could not find container \"a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961\": container with ID starting with a53439c9cddf7977cf41f30ef6534a07726d19274baedb8bf19709b3367ad961 not found: ID does not exist" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.640607 4830 scope.go:117] "RemoveContainer" containerID="2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c" Mar 18 19:05:01 crc kubenswrapper[4830]: E0318 
19:05:01.641230 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c\": container with ID starting with 2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c not found: ID does not exist" containerID="2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c" Mar 18 19:05:01 crc kubenswrapper[4830]: I0318 19:05:01.641286 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c"} err="failed to get container status \"2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c\": rpc error: code = NotFound desc = could not find container \"2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c\": container with ID starting with 2f4e2c35e95c77cb1e0ad58564f6fc0614c1cb3746557f81552ae52ad6b8af5c not found: ID does not exist" Mar 18 19:05:02 crc kubenswrapper[4830]: I0318 19:05:02.250956 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" path="/var/lib/kubelet/pods/93b6c420-8c78-4e77-a50d-62c9527e5907/volumes" Mar 18 19:05:04 crc kubenswrapper[4830]: I0318 19:05:04.234922 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:05:04 crc kubenswrapper[4830]: E0318 19:05:04.235673 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.243211 
4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:05:16 crc kubenswrapper[4830]: E0318 19:05:16.244090 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.970823 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:16 crc kubenswrapper[4830]: E0318 19:05:16.971250 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="registry-server" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.971281 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="registry-server" Mar 18 19:05:16 crc kubenswrapper[4830]: E0318 19:05:16.971299 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="extract-content" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.971309 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="extract-content" Mar 18 19:05:16 crc kubenswrapper[4830]: E0318 19:05:16.971326 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="extract-utilities" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.971339 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="extract-utilities" Mar 18 19:05:16 crc 
kubenswrapper[4830]: I0318 19:05:16.971643 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b6c420-8c78-4e77-a50d-62c9527e5907" containerName="registry-server" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.973100 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:16 crc kubenswrapper[4830]: I0318 19:05:16.991651 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.154399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.154509 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.154539 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mft5s\" (UniqueName: \"kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.255719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.255930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.255978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mft5s\" (UniqueName: \"kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.257200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.257635 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.284456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mft5s\" (UniqueName: 
\"kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s\") pod \"redhat-marketplace-qnf6r\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.315704 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:17 crc kubenswrapper[4830]: I0318 19:05:17.764324 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:18 crc kubenswrapper[4830]: I0318 19:05:18.673792 4830 generic.go:334] "Generic (PLEG): container finished" podID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerID="85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a" exitCode=0 Mar 18 19:05:18 crc kubenswrapper[4830]: I0318 19:05:18.673853 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerDied","Data":"85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a"} Mar 18 19:05:18 crc kubenswrapper[4830]: I0318 19:05:18.674109 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerStarted","Data":"d8ad79bb09764aa51a7878ea8bac0cfee91f2f49d64966189c1fb740a84c807b"} Mar 18 19:05:20 crc kubenswrapper[4830]: I0318 19:05:20.691928 4830 generic.go:334] "Generic (PLEG): container finished" podID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerID="d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d" exitCode=0 Mar 18 19:05:20 crc kubenswrapper[4830]: I0318 19:05:20.692017 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" 
event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerDied","Data":"d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d"} Mar 18 19:05:21 crc kubenswrapper[4830]: I0318 19:05:21.705379 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerStarted","Data":"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4"} Mar 18 19:05:21 crc kubenswrapper[4830]: I0318 19:05:21.740703 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnf6r" podStartSLOduration=3.292276434 podStartE2EDuration="5.74067335s" podCreationTimestamp="2026-03-18 19:05:16 +0000 UTC" firstStartedPulling="2026-03-18 19:05:18.675529712 +0000 UTC m=+3753.243160034" lastFinishedPulling="2026-03-18 19:05:21.123926578 +0000 UTC m=+3755.691556950" observedRunningTime="2026-03-18 19:05:21.737683005 +0000 UTC m=+3756.305313377" watchObservedRunningTime="2026-03-18 19:05:21.74067335 +0000 UTC m=+3756.308303722" Mar 18 19:05:27 crc kubenswrapper[4830]: I0318 19:05:27.316357 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:27 crc kubenswrapper[4830]: I0318 19:05:27.316838 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:27 crc kubenswrapper[4830]: I0318 19:05:27.382744 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:27 crc kubenswrapper[4830]: I0318 19:05:27.828354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:27 crc kubenswrapper[4830]: I0318 19:05:27.868836 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:29 crc kubenswrapper[4830]: I0318 19:05:29.769930 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qnf6r" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="registry-server" containerID="cri-o://76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4" gracePeriod=2 Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.187195 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.198786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content\") pod \"36eea196-f7b8-4303-83dd-a32a3baa616f\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.198841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities\") pod \"36eea196-f7b8-4303-83dd-a32a3baa616f\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.198888 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mft5s\" (UniqueName: \"kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s\") pod \"36eea196-f7b8-4303-83dd-a32a3baa616f\" (UID: \"36eea196-f7b8-4303-83dd-a32a3baa616f\") " Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.202005 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities" (OuterVolumeSpecName: "utilities") pod "36eea196-f7b8-4303-83dd-a32a3baa616f" (UID: 
"36eea196-f7b8-4303-83dd-a32a3baa616f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.215124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s" (OuterVolumeSpecName: "kube-api-access-mft5s") pod "36eea196-f7b8-4303-83dd-a32a3baa616f" (UID: "36eea196-f7b8-4303-83dd-a32a3baa616f"). InnerVolumeSpecName "kube-api-access-mft5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.263584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36eea196-f7b8-4303-83dd-a32a3baa616f" (UID: "36eea196-f7b8-4303-83dd-a32a3baa616f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.300835 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.300869 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36eea196-f7b8-4303-83dd-a32a3baa616f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.300881 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mft5s\" (UniqueName: \"kubernetes.io/projected/36eea196-f7b8-4303-83dd-a32a3baa616f-kube-api-access-mft5s\") on node \"crc\" DevicePath \"\"" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.788714 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerID="76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4" exitCode=0 Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.788838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerDied","Data":"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4"} Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.789248 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnf6r" event={"ID":"36eea196-f7b8-4303-83dd-a32a3baa616f","Type":"ContainerDied","Data":"d8ad79bb09764aa51a7878ea8bac0cfee91f2f49d64966189c1fb740a84c807b"} Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.789284 4830 scope.go:117] "RemoveContainer" containerID="76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.789063 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnf6r" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.828404 4830 scope.go:117] "RemoveContainer" containerID="d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.840135 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.845217 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnf6r"] Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.862762 4830 scope.go:117] "RemoveContainer" containerID="85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.900363 4830 scope.go:117] "RemoveContainer" containerID="76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4" Mar 18 19:05:30 crc kubenswrapper[4830]: E0318 19:05:30.900810 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4\": container with ID starting with 76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4 not found: ID does not exist" containerID="76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.900855 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4"} err="failed to get container status \"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4\": rpc error: code = NotFound desc = could not find container \"76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4\": container with ID starting with 76df038c5629c4849d5f250401f5f2e398237d40a366948a48298b1609fb85c4 not found: 
ID does not exist" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.900886 4830 scope.go:117] "RemoveContainer" containerID="d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d" Mar 18 19:05:30 crc kubenswrapper[4830]: E0318 19:05:30.901448 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d\": container with ID starting with d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d not found: ID does not exist" containerID="d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.901478 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d"} err="failed to get container status \"d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d\": rpc error: code = NotFound desc = could not find container \"d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d\": container with ID starting with d48378351a327b23307dbac8a031e37beb38addd8dc3dd217226775ff59a3c1d not found: ID does not exist" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.901499 4830 scope.go:117] "RemoveContainer" containerID="85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a" Mar 18 19:05:30 crc kubenswrapper[4830]: E0318 19:05:30.902080 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a\": container with ID starting with 85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a not found: ID does not exist" containerID="85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a" Mar 18 19:05:30 crc kubenswrapper[4830]: I0318 19:05:30.902111 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a"} err="failed to get container status \"85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a\": rpc error: code = NotFound desc = could not find container \"85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a\": container with ID starting with 85f592089944f521fc3df7986633d2899eeb66f79a8f70dec6820126182ead7a not found: ID does not exist" Mar 18 19:05:31 crc kubenswrapper[4830]: I0318 19:05:31.235126 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:05:31 crc kubenswrapper[4830]: E0318 19:05:31.235524 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:05:32 crc kubenswrapper[4830]: I0318 19:05:32.253919 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" path="/var/lib/kubelet/pods/36eea196-f7b8-4303-83dd-a32a3baa616f/volumes" Mar 18 19:05:42 crc kubenswrapper[4830]: I0318 19:05:42.235381 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:05:42 crc kubenswrapper[4830]: E0318 19:05:42.236520 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:05:55 crc kubenswrapper[4830]: I0318 19:05:55.234059 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:05:55 crc kubenswrapper[4830]: E0318 19:05:55.234828 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.163004 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564346-dtkfc"] Mar 18 19:06:00 crc kubenswrapper[4830]: E0318 19:06:00.165326 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="registry-server" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.165486 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="registry-server" Mar 18 19:06:00 crc kubenswrapper[4830]: E0318 19:06:00.165647 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="extract-utilities" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.165904 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="extract-utilities" Mar 18 19:06:00 crc kubenswrapper[4830]: E0318 19:06:00.166077 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="extract-content" Mar 18 19:06:00 crc 
kubenswrapper[4830]: I0318 19:06:00.166199 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="extract-content" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.166554 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eea196-f7b8-4303-83dd-a32a3baa616f" containerName="registry-server" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.167499 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.170603 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.173434 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.173655 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.175922 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-dtkfc"] Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.191071 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87dp\" (UniqueName: \"kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp\") pod \"auto-csr-approver-29564346-dtkfc\" (UID: \"7f93b632-cb27-4d42-a3c0-72a82602e0a3\") " pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.292980 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h87dp\" (UniqueName: \"kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp\") pod 
\"auto-csr-approver-29564346-dtkfc\" (UID: \"7f93b632-cb27-4d42-a3c0-72a82602e0a3\") " pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.325509 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87dp\" (UniqueName: \"kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp\") pod \"auto-csr-approver-29564346-dtkfc\" (UID: \"7f93b632-cb27-4d42-a3c0-72a82602e0a3\") " pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:00 crc kubenswrapper[4830]: I0318 19:06:00.503133 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:01 crc kubenswrapper[4830]: I0318 19:06:01.048299 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-dtkfc"] Mar 18 19:06:01 crc kubenswrapper[4830]: I0318 19:06:01.080068 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" event={"ID":"7f93b632-cb27-4d42-a3c0-72a82602e0a3","Type":"ContainerStarted","Data":"234e6f3af57667238cd15f8a48f0094bce76fe539598fc908940487f431205b0"} Mar 18 19:06:03 crc kubenswrapper[4830]: I0318 19:06:03.102662 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f93b632-cb27-4d42-a3c0-72a82602e0a3" containerID="d0205b4f3835583bfd9434c58ab4f2a5777cae060dd1729eb9e6380fe383fa67" exitCode=0 Mar 18 19:06:03 crc kubenswrapper[4830]: I0318 19:06:03.102861 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" event={"ID":"7f93b632-cb27-4d42-a3c0-72a82602e0a3","Type":"ContainerDied","Data":"d0205b4f3835583bfd9434c58ab4f2a5777cae060dd1729eb9e6380fe383fa67"} Mar 18 19:06:04 crc kubenswrapper[4830]: I0318 19:06:04.476868 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:04 crc kubenswrapper[4830]: I0318 19:06:04.561732 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h87dp\" (UniqueName: \"kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp\") pod \"7f93b632-cb27-4d42-a3c0-72a82602e0a3\" (UID: \"7f93b632-cb27-4d42-a3c0-72a82602e0a3\") " Mar 18 19:06:04 crc kubenswrapper[4830]: I0318 19:06:04.572180 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp" (OuterVolumeSpecName: "kube-api-access-h87dp") pod "7f93b632-cb27-4d42-a3c0-72a82602e0a3" (UID: "7f93b632-cb27-4d42-a3c0-72a82602e0a3"). InnerVolumeSpecName "kube-api-access-h87dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:06:04 crc kubenswrapper[4830]: I0318 19:06:04.663135 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h87dp\" (UniqueName: \"kubernetes.io/projected/7f93b632-cb27-4d42-a3c0-72a82602e0a3-kube-api-access-h87dp\") on node \"crc\" DevicePath \"\"" Mar 18 19:06:05 crc kubenswrapper[4830]: I0318 19:06:05.127472 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" event={"ID":"7f93b632-cb27-4d42-a3c0-72a82602e0a3","Type":"ContainerDied","Data":"234e6f3af57667238cd15f8a48f0094bce76fe539598fc908940487f431205b0"} Mar 18 19:06:05 crc kubenswrapper[4830]: I0318 19:06:05.127996 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234e6f3af57667238cd15f8a48f0094bce76fe539598fc908940487f431205b0" Mar 18 19:06:05 crc kubenswrapper[4830]: I0318 19:06:05.127591 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-dtkfc" Mar 18 19:06:05 crc kubenswrapper[4830]: I0318 19:06:05.568075 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-jdt29"] Mar 18 19:06:05 crc kubenswrapper[4830]: I0318 19:06:05.576693 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-jdt29"] Mar 18 19:06:06 crc kubenswrapper[4830]: I0318 19:06:06.250165 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec" path="/var/lib/kubelet/pods/246ca0cc-cd9f-4d3b-a7c7-856b4077d1ec/volumes" Mar 18 19:06:07 crc kubenswrapper[4830]: I0318 19:06:07.235549 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:06:07 crc kubenswrapper[4830]: E0318 19:06:07.235849 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:06:18 crc kubenswrapper[4830]: I0318 19:06:18.596834 4830 scope.go:117] "RemoveContainer" containerID="cb2d005f71cced88f397c9e6a53fab664f89b86834823e262879e348f25cfe28" Mar 18 19:06:20 crc kubenswrapper[4830]: I0318 19:06:20.236093 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:06:20 crc kubenswrapper[4830]: E0318 19:06:20.237371 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:06:35 crc kubenswrapper[4830]: I0318 19:06:35.235508 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:06:35 crc kubenswrapper[4830]: E0318 19:06:35.236987 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:06:50 crc kubenswrapper[4830]: I0318 19:06:50.234916 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:06:50 crc kubenswrapper[4830]: E0318 19:06:50.236458 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:07:01 crc kubenswrapper[4830]: I0318 19:07:01.235898 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:07:01 crc kubenswrapper[4830]: E0318 19:07:01.236816 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:07:14 crc kubenswrapper[4830]: I0318 19:07:14.234607 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:07:14 crc kubenswrapper[4830]: E0318 19:07:14.235572 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:07:29 crc kubenswrapper[4830]: I0318 19:07:29.234698 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:07:29 crc kubenswrapper[4830]: E0318 19:07:29.236162 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:07:43 crc kubenswrapper[4830]: I0318 19:07:43.235160 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:07:43 crc kubenswrapper[4830]: E0318 19:07:43.237642 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:07:55 crc kubenswrapper[4830]: I0318 19:07:55.235852 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:07:55 crc kubenswrapper[4830]: E0318 19:07:55.236452 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.142100 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564348-tx82j"] Mar 18 19:08:00 crc kubenswrapper[4830]: E0318 19:08:00.142672 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f93b632-cb27-4d42-a3c0-72a82602e0a3" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.142687 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f93b632-cb27-4d42-a3c0-72a82602e0a3" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.142909 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f93b632-cb27-4d42-a3c0-72a82602e0a3" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.143426 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.145861 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.146099 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.146412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.161315 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-tx82j"] Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.294402 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts49w\" (UniqueName: \"kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w\") pod \"auto-csr-approver-29564348-tx82j\" (UID: \"cef8db1d-daf3-422a-ac17-ff853a092b99\") " pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.395514 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts49w\" (UniqueName: \"kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w\") pod \"auto-csr-approver-29564348-tx82j\" (UID: \"cef8db1d-daf3-422a-ac17-ff853a092b99\") " pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.672302 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts49w\" (UniqueName: \"kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w\") pod \"auto-csr-approver-29564348-tx82j\" (UID: \"cef8db1d-daf3-422a-ac17-ff853a092b99\") " 
pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:00 crc kubenswrapper[4830]: I0318 19:08:00.811483 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:01 crc kubenswrapper[4830]: I0318 19:08:01.354312 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-tx82j"] Mar 18 19:08:01 crc kubenswrapper[4830]: I0318 19:08:01.366223 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:08:02 crc kubenswrapper[4830]: I0318 19:08:02.230746 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-tx82j" event={"ID":"cef8db1d-daf3-422a-ac17-ff853a092b99","Type":"ContainerStarted","Data":"4eaf16438bd9c110f13b6292b9c07257f44c4d28a3ac35934daef033846d5708"} Mar 18 19:08:03 crc kubenswrapper[4830]: I0318 19:08:03.243174 4830 generic.go:334] "Generic (PLEG): container finished" podID="cef8db1d-daf3-422a-ac17-ff853a092b99" containerID="fb5666dc316f7d77d62639897ce477a72c94a5eade680251402242e0deaa1a61" exitCode=0 Mar 18 19:08:03 crc kubenswrapper[4830]: I0318 19:08:03.243252 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-tx82j" event={"ID":"cef8db1d-daf3-422a-ac17-ff853a092b99","Type":"ContainerDied","Data":"fb5666dc316f7d77d62639897ce477a72c94a5eade680251402242e0deaa1a61"} Mar 18 19:08:04 crc kubenswrapper[4830]: I0318 19:08:04.560368 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:04 crc kubenswrapper[4830]: I0318 19:08:04.661525 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts49w\" (UniqueName: \"kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w\") pod \"cef8db1d-daf3-422a-ac17-ff853a092b99\" (UID: \"cef8db1d-daf3-422a-ac17-ff853a092b99\") " Mar 18 19:08:04 crc kubenswrapper[4830]: I0318 19:08:04.665852 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w" (OuterVolumeSpecName: "kube-api-access-ts49w") pod "cef8db1d-daf3-422a-ac17-ff853a092b99" (UID: "cef8db1d-daf3-422a-ac17-ff853a092b99"). InnerVolumeSpecName "kube-api-access-ts49w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:08:04 crc kubenswrapper[4830]: I0318 19:08:04.762978 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts49w\" (UniqueName: \"kubernetes.io/projected/cef8db1d-daf3-422a-ac17-ff853a092b99-kube-api-access-ts49w\") on node \"crc\" DevicePath \"\"" Mar 18 19:08:05 crc kubenswrapper[4830]: I0318 19:08:05.263123 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-tx82j" event={"ID":"cef8db1d-daf3-422a-ac17-ff853a092b99","Type":"ContainerDied","Data":"4eaf16438bd9c110f13b6292b9c07257f44c4d28a3ac35934daef033846d5708"} Mar 18 19:08:05 crc kubenswrapper[4830]: I0318 19:08:05.263184 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eaf16438bd9c110f13b6292b9c07257f44c4d28a3ac35934daef033846d5708" Mar 18 19:08:05 crc kubenswrapper[4830]: I0318 19:08:05.263191 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-tx82j" Mar 18 19:08:05 crc kubenswrapper[4830]: I0318 19:08:05.638691 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-wctwx"] Mar 18 19:08:05 crc kubenswrapper[4830]: I0318 19:08:05.645597 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-wctwx"] Mar 18 19:08:06 crc kubenswrapper[4830]: I0318 19:08:06.252260 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0ec0fb-f334-41e8-a5e6-47d858b02b63" path="/var/lib/kubelet/pods/6d0ec0fb-f334-41e8-a5e6-47d858b02b63/volumes" Mar 18 19:08:09 crc kubenswrapper[4830]: I0318 19:08:09.235892 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:08:09 crc kubenswrapper[4830]: E0318 19:08:09.236741 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:08:18 crc kubenswrapper[4830]: I0318 19:08:18.685880 4830 scope.go:117] "RemoveContainer" containerID="ccd66cf137b643dbdd204b60b4c5c4fc0dabe230af9303e60222676324d01db4" Mar 18 19:08:23 crc kubenswrapper[4830]: I0318 19:08:23.237134 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:08:23 crc kubenswrapper[4830]: E0318 19:08:23.238278 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:08:34 crc kubenswrapper[4830]: I0318 19:08:34.236273 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:08:34 crc kubenswrapper[4830]: E0318 19:08:34.237583 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:08:45 crc kubenswrapper[4830]: I0318 19:08:45.235667 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:08:45 crc kubenswrapper[4830]: E0318 19:08:45.236668 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:08:59 crc kubenswrapper[4830]: I0318 19:08:59.235237 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:08:59 crc kubenswrapper[4830]: E0318 19:08:59.236234 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:09:11 crc kubenswrapper[4830]: I0318 19:09:11.234473 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:09:11 crc kubenswrapper[4830]: I0318 19:09:11.917292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c"} Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.146440 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564350-ms9hw"] Mar 18 19:10:00 crc kubenswrapper[4830]: E0318 19:10:00.147200 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8db1d-daf3-422a-ac17-ff853a092b99" containerName="oc" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.147212 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8db1d-daf3-422a-ac17-ff853a092b99" containerName="oc" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.147365 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8db1d-daf3-422a-ac17-ff853a092b99" containerName="oc" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.147957 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.149417 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.150601 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.150737 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.162570 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-ms9hw"] Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.234695 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kp5\" (UniqueName: \"kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5\") pod \"auto-csr-approver-29564350-ms9hw\" (UID: \"f87c0735-7c95-4e91-acb4-897e6db6909d\") " pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.336166 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kp5\" (UniqueName: \"kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5\") pod \"auto-csr-approver-29564350-ms9hw\" (UID: \"f87c0735-7c95-4e91-acb4-897e6db6909d\") " pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.366416 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kp5\" (UniqueName: \"kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5\") pod \"auto-csr-approver-29564350-ms9hw\" (UID: \"f87c0735-7c95-4e91-acb4-897e6db6909d\") " 
pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.471374 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:00 crc kubenswrapper[4830]: I0318 19:10:00.772977 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-ms9hw"] Mar 18 19:10:01 crc kubenswrapper[4830]: I0318 19:10:01.402498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" event={"ID":"f87c0735-7c95-4e91-acb4-897e6db6909d","Type":"ContainerStarted","Data":"232da7c192ed25d872c3e3e2be8b1429e489ae7dcd445ba593de40add39752fb"} Mar 18 19:10:03 crc kubenswrapper[4830]: I0318 19:10:03.424944 4830 generic.go:334] "Generic (PLEG): container finished" podID="f87c0735-7c95-4e91-acb4-897e6db6909d" containerID="12a773d396c894165f8c80547c107752459a02cdd7a174af094337122ee97f81" exitCode=0 Mar 18 19:10:03 crc kubenswrapper[4830]: I0318 19:10:03.425082 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" event={"ID":"f87c0735-7c95-4e91-acb4-897e6db6909d","Type":"ContainerDied","Data":"12a773d396c894165f8c80547c107752459a02cdd7a174af094337122ee97f81"} Mar 18 19:10:04 crc kubenswrapper[4830]: I0318 19:10:04.793504 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:04 crc kubenswrapper[4830]: I0318 19:10:04.900862 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6kp5\" (UniqueName: \"kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5\") pod \"f87c0735-7c95-4e91-acb4-897e6db6909d\" (UID: \"f87c0735-7c95-4e91-acb4-897e6db6909d\") " Mar 18 19:10:04 crc kubenswrapper[4830]: I0318 19:10:04.907339 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5" (OuterVolumeSpecName: "kube-api-access-j6kp5") pod "f87c0735-7c95-4e91-acb4-897e6db6909d" (UID: "f87c0735-7c95-4e91-acb4-897e6db6909d"). InnerVolumeSpecName "kube-api-access-j6kp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.003477 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6kp5\" (UniqueName: \"kubernetes.io/projected/f87c0735-7c95-4e91-acb4-897e6db6909d-kube-api-access-j6kp5\") on node \"crc\" DevicePath \"\"" Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.441293 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" event={"ID":"f87c0735-7c95-4e91-acb4-897e6db6909d","Type":"ContainerDied","Data":"232da7c192ed25d872c3e3e2be8b1429e489ae7dcd445ba593de40add39752fb"} Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.441341 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="232da7c192ed25d872c3e3e2be8b1429e489ae7dcd445ba593de40add39752fb" Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.441406 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-ms9hw" Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.888754 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-nm6mj"] Mar 18 19:10:05 crc kubenswrapper[4830]: I0318 19:10:05.895758 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-nm6mj"] Mar 18 19:10:06 crc kubenswrapper[4830]: I0318 19:10:06.249605 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="105ad763-b28d-48b2-a76a-dd01caf486b3" path="/var/lib/kubelet/pods/105ad763-b28d-48b2-a76a-dd01caf486b3/volumes" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.299966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgh45"] Mar 18 19:10:07 crc kubenswrapper[4830]: E0318 19:10:07.300835 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87c0735-7c95-4e91-acb4-897e6db6909d" containerName="oc" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.300855 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87c0735-7c95-4e91-acb4-897e6db6909d" containerName="oc" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.301062 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87c0735-7c95-4e91-acb4-897e6db6909d" containerName="oc" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.302797 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgh45" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.309051 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgh45"] Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.352263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7fl\" (UniqueName: \"kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.352354 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.352399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.454639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7fl\" (UniqueName: \"kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45" Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.454743 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.454858 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.455443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.455589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.481763 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7fl\" (UniqueName: \"kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl\") pod \"certified-operators-cgh45\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") " pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.631285 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:07 crc kubenswrapper[4830]: I0318 19:10:07.915497 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgh45"]
Mar 18 19:10:08 crc kubenswrapper[4830]: I0318 19:10:08.469677 4830 generic.go:334] "Generic (PLEG): container finished" podID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerID="5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d" exitCode=0
Mar 18 19:10:08 crc kubenswrapper[4830]: I0318 19:10:08.469756 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerDied","Data":"5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d"}
Mar 18 19:10:08 crc kubenswrapper[4830]: I0318 19:10:08.469974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerStarted","Data":"b904cd3c413cdfd5d17860db9cace89bf2aef453136cdc07fdf387e406e0e4ad"}
Mar 18 19:10:09 crc kubenswrapper[4830]: I0318 19:10:09.480144 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerStarted","Data":"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"}
Mar 18 19:10:10 crc kubenswrapper[4830]: I0318 19:10:10.493093 4830 generic.go:334] "Generic (PLEG): container finished" podID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerID="17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791" exitCode=0
Mar 18 19:10:10 crc kubenswrapper[4830]: I0318 19:10:10.493194 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerDied","Data":"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"}
Mar 18 19:10:11 crc kubenswrapper[4830]: I0318 19:10:11.505189 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerStarted","Data":"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"}
Mar 18 19:10:11 crc kubenswrapper[4830]: I0318 19:10:11.534849 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgh45" podStartSLOduration=2.045890669 podStartE2EDuration="4.534822355s" podCreationTimestamp="2026-03-18 19:10:07 +0000 UTC" firstStartedPulling="2026-03-18 19:10:08.471994025 +0000 UTC m=+4043.039624357" lastFinishedPulling="2026-03-18 19:10:10.960925681 +0000 UTC m=+4045.528556043" observedRunningTime="2026-03-18 19:10:11.527939631 +0000 UTC m=+4046.095569973" watchObservedRunningTime="2026-03-18 19:10:11.534822355 +0000 UTC m=+4046.102452697"
Mar 18 19:10:17 crc kubenswrapper[4830]: I0318 19:10:17.631525 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:17 crc kubenswrapper[4830]: I0318 19:10:17.632181 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:17 crc kubenswrapper[4830]: I0318 19:10:17.708708 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:18 crc kubenswrapper[4830]: I0318 19:10:18.639254 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:18 crc kubenswrapper[4830]: I0318 19:10:18.733646 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgh45"]
Mar 18 19:10:18 crc kubenswrapper[4830]: I0318 19:10:18.788694 4830 scope.go:117] "RemoveContainer" containerID="8b1ec9e3a0da55248708acac973d7994798f9b298376f475afb8c930b9fa502e"
Mar 18 19:10:20 crc kubenswrapper[4830]: I0318 19:10:20.581201 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgh45" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="registry-server" containerID="cri-o://ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd" gracePeriod=2
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.252922 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.391249 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content\") pod \"bd4124d1-358c-4272-bca3-39a1dff119b0\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") "
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.391467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities\") pod \"bd4124d1-358c-4272-bca3-39a1dff119b0\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") "
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.391604 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x7fl\" (UniqueName: \"kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl\") pod \"bd4124d1-358c-4272-bca3-39a1dff119b0\" (UID: \"bd4124d1-358c-4272-bca3-39a1dff119b0\") "
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.392541 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities" (OuterVolumeSpecName: "utilities") pod "bd4124d1-358c-4272-bca3-39a1dff119b0" (UID: "bd4124d1-358c-4272-bca3-39a1dff119b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.404936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl" (OuterVolumeSpecName: "kube-api-access-5x7fl") pod "bd4124d1-358c-4272-bca3-39a1dff119b0" (UID: "bd4124d1-358c-4272-bca3-39a1dff119b0"). InnerVolumeSpecName "kube-api-access-5x7fl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.441309 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd4124d1-358c-4272-bca3-39a1dff119b0" (UID: "bd4124d1-358c-4272-bca3-39a1dff119b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.493921 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x7fl\" (UniqueName: \"kubernetes.io/projected/bd4124d1-358c-4272-bca3-39a1dff119b0-kube-api-access-5x7fl\") on node \"crc\" DevicePath \"\""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.493954 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.493964 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4124d1-358c-4272-bca3-39a1dff119b0-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.594678 4830 generic.go:334] "Generic (PLEG): container finished" podID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerID="ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd" exitCode=0
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.594742 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerDied","Data":"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"}
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.594791 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgh45"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.594816 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgh45" event={"ID":"bd4124d1-358c-4272-bca3-39a1dff119b0","Type":"ContainerDied","Data":"b904cd3c413cdfd5d17860db9cace89bf2aef453136cdc07fdf387e406e0e4ad"}
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.594846 4830 scope.go:117] "RemoveContainer" containerID="ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.622675 4830 scope.go:117] "RemoveContainer" containerID="17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.638633 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgh45"]
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.644926 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgh45"]
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.662518 4830 scope.go:117] "RemoveContainer" containerID="5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.694476 4830 scope.go:117] "RemoveContainer" containerID="ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"
Mar 18 19:10:21 crc kubenswrapper[4830]: E0318 19:10:21.695463 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd\": container with ID starting with ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd not found: ID does not exist" containerID="ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.695577 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd"} err="failed to get container status \"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd\": rpc error: code = NotFound desc = could not find container \"ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd\": container with ID starting with ce68d543733628f5ffeeed3da84a764939bef5f751cd8b2a52e7fd4c62e391cd not found: ID does not exist"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.695663 4830 scope.go:117] "RemoveContainer" containerID="17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"
Mar 18 19:10:21 crc kubenswrapper[4830]: E0318 19:10:21.696365 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791\": container with ID starting with 17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791 not found: ID does not exist" containerID="17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.696490 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791"} err="failed to get container status \"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791\": rpc error: code = NotFound desc = could not find container \"17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791\": container with ID starting with 17ed1bfe3e36f7a211995e68f0cbb707b2ad0804ce9baf9b5536283dda672791 not found: ID does not exist"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.696531 4830 scope.go:117] "RemoveContainer" containerID="5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d"
Mar 18 19:10:21 crc kubenswrapper[4830]: E0318 19:10:21.697134 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d\": container with ID starting with 5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d not found: ID does not exist" containerID="5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d"
Mar 18 19:10:21 crc kubenswrapper[4830]: I0318 19:10:21.697190 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d"} err="failed to get container status \"5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d\": rpc error: code = NotFound desc = could not find container \"5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d\": container with ID starting with 5739b0221266a7954cdc6755faf8c3b78eac2609469bb602dc80ad60663bbf5d not found: ID does not exist"
Mar 18 19:10:22 crc kubenswrapper[4830]: I0318 19:10:22.248818 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" path="/var/lib/kubelet/pods/bd4124d1-358c-4272-bca3-39a1dff119b0/volumes"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.083440 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:27 crc kubenswrapper[4830]: E0318 19:11:27.088237 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="extract-utilities"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.088263 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="extract-utilities"
Mar 18 19:11:27 crc kubenswrapper[4830]: E0318 19:11:27.088296 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="registry-server"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.088310 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="registry-server"
Mar 18 19:11:27 crc kubenswrapper[4830]: E0318 19:11:27.088355 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="extract-content"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.088369 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="extract-content"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.088645 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4124d1-358c-4272-bca3-39a1dff119b0" containerName="registry-server"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.128646 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.128890 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.199357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.199642 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcnv\" (UniqueName: \"kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.199827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.301366 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.301427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcnv\" (UniqueName: \"kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.301492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.302123 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.302310 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.324953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcnv\" (UniqueName: \"kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv\") pod \"community-operators-v6dfk\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") " pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.457856 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:27 crc kubenswrapper[4830]: I0318 19:11:27.820391 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:28 crc kubenswrapper[4830]: I0318 19:11:28.208181 4830 generic.go:334] "Generic (PLEG): container finished" podID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerID="dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113" exitCode=0
Mar 18 19:11:28 crc kubenswrapper[4830]: I0318 19:11:28.208242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerDied","Data":"dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113"}
Mar 18 19:11:28 crc kubenswrapper[4830]: I0318 19:11:28.208289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerStarted","Data":"fb490d2e72ecf25e6c17ae61cea22efc25b9f87c0ee366709d8137fa4cb5a11b"}
Mar 18 19:11:29 crc kubenswrapper[4830]: I0318 19:11:29.509561 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:11:29 crc kubenswrapper[4830]: I0318 19:11:29.509925 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:11:30 crc kubenswrapper[4830]: I0318 19:11:30.228915 4830 generic.go:334] "Generic (PLEG): container finished" podID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerID="c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8" exitCode=0
Mar 18 19:11:30 crc kubenswrapper[4830]: I0318 19:11:30.229299 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerDied","Data":"c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8"}
Mar 18 19:11:32 crc kubenswrapper[4830]: I0318 19:11:32.279037 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerStarted","Data":"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"}
Mar 18 19:11:32 crc kubenswrapper[4830]: I0318 19:11:32.314493 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6dfk" podStartSLOduration=2.839999345 podStartE2EDuration="5.314463773s" podCreationTimestamp="2026-03-18 19:11:27 +0000 UTC" firstStartedPulling="2026-03-18 19:11:28.210879272 +0000 UTC m=+4122.778509604" lastFinishedPulling="2026-03-18 19:11:30.68534369 +0000 UTC m=+4125.252974032" observedRunningTime="2026-03-18 19:11:32.307117616 +0000 UTC m=+4126.874747968" watchObservedRunningTime="2026-03-18 19:11:32.314463773 +0000 UTC m=+4126.882094145"
Mar 18 19:11:37 crc kubenswrapper[4830]: I0318 19:11:37.457979 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:37 crc kubenswrapper[4830]: I0318 19:11:37.459927 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:37 crc kubenswrapper[4830]: I0318 19:11:37.536935 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:38 crc kubenswrapper[4830]: I0318 19:11:38.401404 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:39 crc kubenswrapper[4830]: I0318 19:11:39.770140 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:41 crc kubenswrapper[4830]: I0318 19:11:41.365343 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v6dfk" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="registry-server" containerID="cri-o://3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8" gracePeriod=2
Mar 18 19:11:41 crc kubenswrapper[4830]: I0318 19:11:41.865529 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.044148 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content\") pod \"513b0077-e23d-446c-92ce-9b96827e5dd4\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") "
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.044364 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities\") pod \"513b0077-e23d-446c-92ce-9b96827e5dd4\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") "
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.044654 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwcnv\" (UniqueName: \"kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv\") pod \"513b0077-e23d-446c-92ce-9b96827e5dd4\" (UID: \"513b0077-e23d-446c-92ce-9b96827e5dd4\") "
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.047633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities" (OuterVolumeSpecName: "utilities") pod "513b0077-e23d-446c-92ce-9b96827e5dd4" (UID: "513b0077-e23d-446c-92ce-9b96827e5dd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.051614 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv" (OuterVolumeSpecName: "kube-api-access-kwcnv") pod "513b0077-e23d-446c-92ce-9b96827e5dd4" (UID: "513b0077-e23d-446c-92ce-9b96827e5dd4"). InnerVolumeSpecName "kube-api-access-kwcnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.130405 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "513b0077-e23d-446c-92ce-9b96827e5dd4" (UID: "513b0077-e23d-446c-92ce-9b96827e5dd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.149073 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.149134 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwcnv\" (UniqueName: \"kubernetes.io/projected/513b0077-e23d-446c-92ce-9b96827e5dd4-kube-api-access-kwcnv\") on node \"crc\" DevicePath \"\""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.149162 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/513b0077-e23d-446c-92ce-9b96827e5dd4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.380984 4830 generic.go:334] "Generic (PLEG): container finished" podID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerID="3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8" exitCode=0
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.381078 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6dfk"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.381080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerDied","Data":"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"}
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.382250 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6dfk" event={"ID":"513b0077-e23d-446c-92ce-9b96827e5dd4","Type":"ContainerDied","Data":"fb490d2e72ecf25e6c17ae61cea22efc25b9f87c0ee366709d8137fa4cb5a11b"}
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.382297 4830 scope.go:117] "RemoveContainer" containerID="3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.416435 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.421000 4830 scope.go:117] "RemoveContainer" containerID="c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.425841 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v6dfk"]
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.448892 4830 scope.go:117] "RemoveContainer" containerID="dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.497781 4830 scope.go:117] "RemoveContainer" containerID="3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"
Mar 18 19:11:42 crc kubenswrapper[4830]: E0318 19:11:42.498352 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8\": container with ID starting with 3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8 not found: ID does not exist" containerID="3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.498423 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8"} err="failed to get container status \"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8\": rpc error: code = NotFound desc = could not find container \"3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8\": container with ID starting with 3916b8822a93aba3f583a4e8b986c66c9eb73ebf74e5fa1a0ec26db8084bdec8 not found: ID does not exist"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.498488 4830 scope.go:117] "RemoveContainer" containerID="c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8"
Mar 18 19:11:42 crc kubenswrapper[4830]: E0318 19:11:42.498992 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8\": container with ID starting with c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8 not found: ID does not exist" containerID="c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.499032 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8"} err="failed to get container status \"c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8\": rpc error: code = NotFound desc = could not find container \"c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8\": container with ID starting with c6dc18299477c621d38242b8277f52c8d3801b4561534c74538381ce8b1d00e8 not found: ID does not exist"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.499065 4830 scope.go:117] "RemoveContainer" containerID="dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113"
Mar 18 19:11:42 crc kubenswrapper[4830]: E0318 19:11:42.499821 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113\": container with ID starting with dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113 not found: ID does not exist" containerID="dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113"
Mar 18 19:11:42 crc kubenswrapper[4830]: I0318 19:11:42.499847 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113"} err="failed to get container status \"dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113\": rpc error: code = NotFound desc = could not find container \"dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113\": container with ID starting with dbe5d78d1df3ac0e92ecb0e7cdd5f047eca2a691423b830999201c34dc420113 not found: ID does not exist"
Mar 18 19:11:44 crc kubenswrapper[4830]: I0318 19:11:44.253231 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" path="/var/lib/kubelet/pods/513b0077-e23d-446c-92ce-9b96827e5dd4/volumes"
Mar 18 19:11:59 crc kubenswrapper[4830]: I0318 19:11:59.510085 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:11:59 crc kubenswrapper[4830]: I0318 19:11:59.511080 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.164604 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564352-7vd96"]
Mar 18 19:12:00 crc kubenswrapper[4830]: E0318 19:12:00.165674 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="registry-server"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.165911 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="registry-server"
Mar 18 19:12:00 crc kubenswrapper[4830]: E0318 19:12:00.166098 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="extract-utilities"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.166284 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="extract-utilities"
Mar 18 19:12:00 crc kubenswrapper[4830]: E0318 19:12:00.166476 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="extract-content"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.166679 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="extract-content"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.167204 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="513b0077-e23d-446c-92ce-9b96827e5dd4" containerName="registry-server"
Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.168429 4830 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.171451 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.172626 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.176072 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.188615 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-7vd96"] Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.264684 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv79m\" (UniqueName: \"kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m\") pod \"auto-csr-approver-29564352-7vd96\" (UID: \"61019ad2-9549-4b71-bb90-06eaa98298fc\") " pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.366899 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv79m\" (UniqueName: \"kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m\") pod \"auto-csr-approver-29564352-7vd96\" (UID: \"61019ad2-9549-4b71-bb90-06eaa98298fc\") " pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.399694 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv79m\" (UniqueName: \"kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m\") pod \"auto-csr-approver-29564352-7vd96\" (UID: 
\"61019ad2-9549-4b71-bb90-06eaa98298fc\") " pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:00 crc kubenswrapper[4830]: I0318 19:12:00.507013 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:01 crc kubenswrapper[4830]: I0318 19:12:01.546269 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-7vd96"] Mar 18 19:12:01 crc kubenswrapper[4830]: I0318 19:12:01.568437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-7vd96" event={"ID":"61019ad2-9549-4b71-bb90-06eaa98298fc","Type":"ContainerStarted","Data":"d64eeb948be72a57a9fa2858c2e5fc4dbdd89c1bd618926de57be20305c4545b"} Mar 18 19:12:03 crc kubenswrapper[4830]: I0318 19:12:03.593663 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-7vd96" event={"ID":"61019ad2-9549-4b71-bb90-06eaa98298fc","Type":"ContainerStarted","Data":"e7e8a6cca51c6788c2f81b0b4e5cf2275f953ec28b83fe3ee4063170fc7c2373"} Mar 18 19:12:03 crc kubenswrapper[4830]: I0318 19:12:03.613489 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564352-7vd96" podStartSLOduration=2.056385284 podStartE2EDuration="3.61346176s" podCreationTimestamp="2026-03-18 19:12:00 +0000 UTC" firstStartedPulling="2026-03-18 19:12:01.552847957 +0000 UTC m=+4156.120478319" lastFinishedPulling="2026-03-18 19:12:03.109924433 +0000 UTC m=+4157.677554795" observedRunningTime="2026-03-18 19:12:03.610878578 +0000 UTC m=+4158.178508960" watchObservedRunningTime="2026-03-18 19:12:03.61346176 +0000 UTC m=+4158.181092132" Mar 18 19:12:04 crc kubenswrapper[4830]: I0318 19:12:04.605416 4830 generic.go:334] "Generic (PLEG): container finished" podID="61019ad2-9549-4b71-bb90-06eaa98298fc" containerID="e7e8a6cca51c6788c2f81b0b4e5cf2275f953ec28b83fe3ee4063170fc7c2373" 
exitCode=0 Mar 18 19:12:04 crc kubenswrapper[4830]: I0318 19:12:04.605475 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-7vd96" event={"ID":"61019ad2-9549-4b71-bb90-06eaa98298fc","Type":"ContainerDied","Data":"e7e8a6cca51c6788c2f81b0b4e5cf2275f953ec28b83fe3ee4063170fc7c2373"} Mar 18 19:12:05 crc kubenswrapper[4830]: I0318 19:12:05.992271 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.064330 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv79m\" (UniqueName: \"kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m\") pod \"61019ad2-9549-4b71-bb90-06eaa98298fc\" (UID: \"61019ad2-9549-4b71-bb90-06eaa98298fc\") " Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.077206 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m" (OuterVolumeSpecName: "kube-api-access-dv79m") pod "61019ad2-9549-4b71-bb90-06eaa98298fc" (UID: "61019ad2-9549-4b71-bb90-06eaa98298fc"). InnerVolumeSpecName "kube-api-access-dv79m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.166475 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv79m\" (UniqueName: \"kubernetes.io/projected/61019ad2-9549-4b71-bb90-06eaa98298fc-kube-api-access-dv79m\") on node \"crc\" DevicePath \"\"" Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.626163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-7vd96" event={"ID":"61019ad2-9549-4b71-bb90-06eaa98298fc","Type":"ContainerDied","Data":"d64eeb948be72a57a9fa2858c2e5fc4dbdd89c1bd618926de57be20305c4545b"} Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.626232 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64eeb948be72a57a9fa2858c2e5fc4dbdd89c1bd618926de57be20305c4545b" Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.626266 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-7vd96" Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.707620 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-dtkfc"] Mar 18 19:12:06 crc kubenswrapper[4830]: I0318 19:12:06.716955 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-dtkfc"] Mar 18 19:12:08 crc kubenswrapper[4830]: I0318 19:12:08.252118 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f93b632-cb27-4d42-a3c0-72a82602e0a3" path="/var/lib/kubelet/pods/7f93b632-cb27-4d42-a3c0-72a82602e0a3/volumes" Mar 18 19:12:18 crc kubenswrapper[4830]: I0318 19:12:18.939340 4830 scope.go:117] "RemoveContainer" containerID="d0205b4f3835583bfd9434c58ab4f2a5777cae060dd1729eb9e6380fe383fa67" Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.510275 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.511110 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.511196 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.512337 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.512457 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c" gracePeriod=600 Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.836367 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c" exitCode=0 Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.836454 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c"} Mar 18 19:12:29 crc kubenswrapper[4830]: I0318 19:12:29.836931 4830 scope.go:117] "RemoveContainer" containerID="c8d0fa381806ea089b307b93dbbb4d8d9b6965319f263d699b74c2b596415bd8" Mar 18 19:12:30 crc kubenswrapper[4830]: I0318 19:12:30.846502 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"} Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.166976 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564354-n2fgb"] Mar 18 19:14:00 crc kubenswrapper[4830]: E0318 19:14:00.168238 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61019ad2-9549-4b71-bb90-06eaa98298fc" containerName="oc" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.168263 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="61019ad2-9549-4b71-bb90-06eaa98298fc" containerName="oc" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.168538 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="61019ad2-9549-4b71-bb90-06eaa98298fc" containerName="oc" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.169852 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.175872 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.176526 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.178258 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.186303 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-n2fgb"] Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.236749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c6m4\" (UniqueName: \"kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4\") pod \"auto-csr-approver-29564354-n2fgb\" (UID: \"7fc2a367-20ae-465b-b520-ee4ad9292563\") " pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.338983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6m4\" (UniqueName: \"kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4\") pod \"auto-csr-approver-29564354-n2fgb\" (UID: \"7fc2a367-20ae-465b-b520-ee4ad9292563\") " pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.375041 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c6m4\" (UniqueName: \"kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4\") pod \"auto-csr-approver-29564354-n2fgb\" (UID: \"7fc2a367-20ae-465b-b520-ee4ad9292563\") " 
pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.525295 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.776328 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-n2fgb"] Mar 18 19:14:00 crc kubenswrapper[4830]: I0318 19:14:00.779105 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:14:01 crc kubenswrapper[4830]: I0318 19:14:01.706563 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" event={"ID":"7fc2a367-20ae-465b-b520-ee4ad9292563","Type":"ContainerStarted","Data":"d0ece21b31e4fe9d7a178f10d442d448c37cb85baf398a8e9505909b38c9d61f"} Mar 18 19:14:02 crc kubenswrapper[4830]: I0318 19:14:02.719151 4830 generic.go:334] "Generic (PLEG): container finished" podID="7fc2a367-20ae-465b-b520-ee4ad9292563" containerID="251bbc0619075efdaaa56c35f4ba7ceabfd9cb735ee672554d3c98d25b08ae57" exitCode=0 Mar 18 19:14:02 crc kubenswrapper[4830]: I0318 19:14:02.719251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" event={"ID":"7fc2a367-20ae-465b-b520-ee4ad9292563","Type":"ContainerDied","Data":"251bbc0619075efdaaa56c35f4ba7ceabfd9cb735ee672554d3c98d25b08ae57"} Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.102460 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.204092 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c6m4\" (UniqueName: \"kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4\") pod \"7fc2a367-20ae-465b-b520-ee4ad9292563\" (UID: \"7fc2a367-20ae-465b-b520-ee4ad9292563\") " Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.211008 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4" (OuterVolumeSpecName: "kube-api-access-2c6m4") pod "7fc2a367-20ae-465b-b520-ee4ad9292563" (UID: "7fc2a367-20ae-465b-b520-ee4ad9292563"). InnerVolumeSpecName "kube-api-access-2c6m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.306176 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c6m4\" (UniqueName: \"kubernetes.io/projected/7fc2a367-20ae-465b-b520-ee4ad9292563-kube-api-access-2c6m4\") on node \"crc\" DevicePath \"\"" Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.740177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" event={"ID":"7fc2a367-20ae-465b-b520-ee4ad9292563","Type":"ContainerDied","Data":"d0ece21b31e4fe9d7a178f10d442d448c37cb85baf398a8e9505909b38c9d61f"} Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.740576 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ece21b31e4fe9d7a178f10d442d448c37cb85baf398a8e9505909b38c9d61f" Mar 18 19:14:04 crc kubenswrapper[4830]: I0318 19:14:04.740313 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-n2fgb" Mar 18 19:14:05 crc kubenswrapper[4830]: I0318 19:14:05.187260 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-tx82j"] Mar 18 19:14:05 crc kubenswrapper[4830]: I0318 19:14:05.196047 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-tx82j"] Mar 18 19:14:06 crc kubenswrapper[4830]: I0318 19:14:06.254821 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef8db1d-daf3-422a-ac17-ff853a092b99" path="/var/lib/kubelet/pods/cef8db1d-daf3-422a-ac17-ff853a092b99/volumes" Mar 18 19:14:19 crc kubenswrapper[4830]: I0318 19:14:19.075743 4830 scope.go:117] "RemoveContainer" containerID="fb5666dc316f7d77d62639897ce477a72c94a5eade680251402242e0deaa1a61" Mar 18 19:14:29 crc kubenswrapper[4830]: I0318 19:14:29.509277 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:14:29 crc kubenswrapper[4830]: I0318 19:14:29.509683 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:14:59 crc kubenswrapper[4830]: I0318 19:14:59.510345 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:14:59 crc kubenswrapper[4830]: 
I0318 19:14:59.511115 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.154479 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn"] Mar 18 19:15:00 crc kubenswrapper[4830]: E0318 19:15:00.154768 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc2a367-20ae-465b-b520-ee4ad9292563" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.154843 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc2a367-20ae-465b-b520-ee4ad9292563" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.154978 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc2a367-20ae-465b-b520-ee4ad9292563" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.155395 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.158506 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.158825 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.174481 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn"] Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.332417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.332483 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.332875 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5p4j\" (UniqueName: \"kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.434366 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5p4j\" (UniqueName: \"kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.434537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.434598 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.436504 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.449517 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.471250 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5p4j\" (UniqueName: \"kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j\") pod \"collect-profiles-29564355-vwtrn\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:00 crc kubenswrapper[4830]: I0318 19:15:00.772905 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:01 crc kubenswrapper[4830]: I0318 19:15:01.243766 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn"] Mar 18 19:15:01 crc kubenswrapper[4830]: W0318 19:15:01.579985 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065dad25_9d0f_42f9_9b7d_1d549f51f690.slice/crio-7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec WatchSource:0}: Error finding container 7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec: Status 404 returned error can't find the container with id 7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec Mar 18 19:15:02 crc kubenswrapper[4830]: I0318 19:15:02.275302 4830 generic.go:334] "Generic (PLEG): container finished" podID="065dad25-9d0f-42f9-9b7d-1d549f51f690" containerID="b3186207fc072bb2a91ad5a2c7d1f397e32ca90e1f941d7d50f74575b50d08c5" exitCode=0 Mar 18 19:15:02 crc kubenswrapper[4830]: I0318 19:15:02.275474 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" event={"ID":"065dad25-9d0f-42f9-9b7d-1d549f51f690","Type":"ContainerDied","Data":"b3186207fc072bb2a91ad5a2c7d1f397e32ca90e1f941d7d50f74575b50d08c5"} Mar 18 19:15:02 crc kubenswrapper[4830]: I0318 19:15:02.276272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" event={"ID":"065dad25-9d0f-42f9-9b7d-1d549f51f690","Type":"ContainerStarted","Data":"7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec"} Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.622507 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.784617 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume\") pod \"065dad25-9d0f-42f9-9b7d-1d549f51f690\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.785084 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5p4j\" (UniqueName: \"kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j\") pod \"065dad25-9d0f-42f9-9b7d-1d549f51f690\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.785179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume\") pod \"065dad25-9d0f-42f9-9b7d-1d549f51f690\" (UID: \"065dad25-9d0f-42f9-9b7d-1d549f51f690\") " Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.786551 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume" (OuterVolumeSpecName: "config-volume") pod "065dad25-9d0f-42f9-9b7d-1d549f51f690" (UID: "065dad25-9d0f-42f9-9b7d-1d549f51f690"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.792521 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "065dad25-9d0f-42f9-9b7d-1d549f51f690" (UID: "065dad25-9d0f-42f9-9b7d-1d549f51f690"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.792540 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j" (OuterVolumeSpecName: "kube-api-access-z5p4j") pod "065dad25-9d0f-42f9-9b7d-1d549f51f690" (UID: "065dad25-9d0f-42f9-9b7d-1d549f51f690"). InnerVolumeSpecName "kube-api-access-z5p4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.887685 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/065dad25-9d0f-42f9-9b7d-1d549f51f690-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.888023 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5p4j\" (UniqueName: \"kubernetes.io/projected/065dad25-9d0f-42f9-9b7d-1d549f51f690-kube-api-access-z5p4j\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:03 crc kubenswrapper[4830]: I0318 19:15:03.888148 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/065dad25-9d0f-42f9-9b7d-1d549f51f690-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:04 crc kubenswrapper[4830]: I0318 19:15:04.295840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" event={"ID":"065dad25-9d0f-42f9-9b7d-1d549f51f690","Type":"ContainerDied","Data":"7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec"} Mar 18 19:15:04 crc kubenswrapper[4830]: I0318 19:15:04.295887 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c24e7034bed94f760c8781511f99d1242efb6dcc6b1f6816291a374058d6cec" Mar 18 19:15:04 crc kubenswrapper[4830]: I0318 19:15:04.295917 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-vwtrn" Mar 18 19:15:04 crc kubenswrapper[4830]: I0318 19:15:04.717715 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"] Mar 18 19:15:04 crc kubenswrapper[4830]: I0318 19:15:04.726083 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-lpqmq"] Mar 18 19:15:06 crc kubenswrapper[4830]: I0318 19:15:06.245465 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb0be4c-a5d0-4933-b04a-74b521c9a26b" path="/var/lib/kubelet/pods/4eb0be4c-a5d0-4933-b04a-74b521c9a26b/volumes" Mar 18 19:15:19 crc kubenswrapper[4830]: I0318 19:15:19.169697 4830 scope.go:117] "RemoveContainer" containerID="816a46d175ee8b1b5e08a4d5b9b015164dd17158fa145c0eb40cfb987577a846" Mar 18 19:15:23 crc kubenswrapper[4830]: I0318 19:15:23.958904 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:23 crc kubenswrapper[4830]: E0318 19:15:23.960332 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065dad25-9d0f-42f9-9b7d-1d549f51f690" containerName="collect-profiles" Mar 18 19:15:23 crc kubenswrapper[4830]: I0318 19:15:23.960367 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="065dad25-9d0f-42f9-9b7d-1d549f51f690" containerName="collect-profiles" Mar 18 19:15:23 crc kubenswrapper[4830]: I0318 19:15:23.960752 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="065dad25-9d0f-42f9-9b7d-1d549f51f690" containerName="collect-profiles" Mar 18 19:15:23 crc kubenswrapper[4830]: I0318 19:15:23.972538 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:23 crc kubenswrapper[4830]: I0318 19:15:23.995411 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.044401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.044447 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvxl\" (UniqueName: \"kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.044504 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.145635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.145686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ksvxl\" (UniqueName: \"kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.145747 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.146261 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.146600 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.184140 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvxl\" (UniqueName: \"kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl\") pod \"redhat-operators-wc8wj\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.310206 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:24 crc kubenswrapper[4830]: I0318 19:15:24.577420 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:25 crc kubenswrapper[4830]: I0318 19:15:25.526680 4830 generic.go:334] "Generic (PLEG): container finished" podID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerID="2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035" exitCode=0 Mar 18 19:15:25 crc kubenswrapper[4830]: I0318 19:15:25.526820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerDied","Data":"2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035"} Mar 18 19:15:25 crc kubenswrapper[4830]: I0318 19:15:25.527044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerStarted","Data":"1e466d14157830cb7ca49ec0bb15f10adcee76f989a1762155dad612b349a320"} Mar 18 19:15:26 crc kubenswrapper[4830]: I0318 19:15:26.543591 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerStarted","Data":"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5"} Mar 18 19:15:27 crc kubenswrapper[4830]: I0318 19:15:27.557858 4830 generic.go:334] "Generic (PLEG): container finished" podID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerID="4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5" exitCode=0 Mar 18 19:15:27 crc kubenswrapper[4830]: I0318 19:15:27.557932 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" 
event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerDied","Data":"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5"} Mar 18 19:15:28 crc kubenswrapper[4830]: I0318 19:15:28.574274 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerStarted","Data":"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d"} Mar 18 19:15:28 crc kubenswrapper[4830]: I0318 19:15:28.608026 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wc8wj" podStartSLOduration=3.053004066 podStartE2EDuration="5.607999988s" podCreationTimestamp="2026-03-18 19:15:23 +0000 UTC" firstStartedPulling="2026-03-18 19:15:25.530436614 +0000 UTC m=+4360.098066946" lastFinishedPulling="2026-03-18 19:15:28.085432526 +0000 UTC m=+4362.653062868" observedRunningTime="2026-03-18 19:15:28.607564376 +0000 UTC m=+4363.175194738" watchObservedRunningTime="2026-03-18 19:15:28.607999988 +0000 UTC m=+4363.175630350" Mar 18 19:15:29 crc kubenswrapper[4830]: I0318 19:15:29.510418 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:15:29 crc kubenswrapper[4830]: I0318 19:15:29.510497 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:15:29 crc kubenswrapper[4830]: I0318 19:15:29.510555 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:15:29 crc kubenswrapper[4830]: I0318 19:15:29.511312 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:15:29 crc kubenswrapper[4830]: I0318 19:15:29.511397 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" gracePeriod=600 Mar 18 19:15:29 crc kubenswrapper[4830]: E0318 19:15:29.640813 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:15:30 crc kubenswrapper[4830]: I0318 19:15:30.591326 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" exitCode=0 Mar 18 19:15:30 crc kubenswrapper[4830]: I0318 19:15:30.591401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"} Mar 18 19:15:30 crc 
kubenswrapper[4830]: I0318 19:15:30.591619 4830 scope.go:117] "RemoveContainer" containerID="cf7d9c2b6dea6abbfa87a4fa3f7f0f5653ccafbb3d4a9baa3dff8679fc8ef01c" Mar 18 19:15:30 crc kubenswrapper[4830]: I0318 19:15:30.592004 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:15:30 crc kubenswrapper[4830]: E0318 19:15:30.592200 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:15:34 crc kubenswrapper[4830]: I0318 19:15:34.311297 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:34 crc kubenswrapper[4830]: I0318 19:15:34.311648 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:35 crc kubenswrapper[4830]: I0318 19:15:35.362348 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wc8wj" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="registry-server" probeResult="failure" output=< Mar 18 19:15:35 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 19:15:35 crc kubenswrapper[4830]: > Mar 18 19:15:42 crc kubenswrapper[4830]: I0318 19:15:42.235461 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:15:42 crc kubenswrapper[4830]: E0318 19:15:42.236581 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:15:44 crc kubenswrapper[4830]: I0318 19:15:44.387532 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:44 crc kubenswrapper[4830]: I0318 19:15:44.461435 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:44 crc kubenswrapper[4830]: I0318 19:15:44.633240 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:45 crc kubenswrapper[4830]: I0318 19:15:45.746141 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wc8wj" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="registry-server" containerID="cri-o://c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d" gracePeriod=2 Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.272832 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.436172 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksvxl\" (UniqueName: \"kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl\") pod \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.436327 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities\") pod \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.436464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content\") pod \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\" (UID: \"f986dda7-facc-4d2a-9d5c-1dceabdeed58\") " Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.437398 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities" (OuterVolumeSpecName: "utilities") pod "f986dda7-facc-4d2a-9d5c-1dceabdeed58" (UID: "f986dda7-facc-4d2a-9d5c-1dceabdeed58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.446966 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl" (OuterVolumeSpecName: "kube-api-access-ksvxl") pod "f986dda7-facc-4d2a-9d5c-1dceabdeed58" (UID: "f986dda7-facc-4d2a-9d5c-1dceabdeed58"). InnerVolumeSpecName "kube-api-access-ksvxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.539073 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.539131 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksvxl\" (UniqueName: \"kubernetes.io/projected/f986dda7-facc-4d2a-9d5c-1dceabdeed58-kube-api-access-ksvxl\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.617984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f986dda7-facc-4d2a-9d5c-1dceabdeed58" (UID: "f986dda7-facc-4d2a-9d5c-1dceabdeed58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.640625 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f986dda7-facc-4d2a-9d5c-1dceabdeed58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.759063 4830 generic.go:334] "Generic (PLEG): container finished" podID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerID="c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d" exitCode=0 Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.759141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerDied","Data":"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d"} Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.759197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wc8wj" event={"ID":"f986dda7-facc-4d2a-9d5c-1dceabdeed58","Type":"ContainerDied","Data":"1e466d14157830cb7ca49ec0bb15f10adcee76f989a1762155dad612b349a320"} Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.759236 4830 scope.go:117] "RemoveContainer" containerID="c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.759458 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wc8wj" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.785914 4830 scope.go:117] "RemoveContainer" containerID="4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.812809 4830 scope.go:117] "RemoveContainer" containerID="2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.820303 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.829190 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wc8wj"] Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.862181 4830 scope.go:117] "RemoveContainer" containerID="c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d" Mar 18 19:15:46 crc kubenswrapper[4830]: E0318 19:15:46.862578 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d\": container with ID starting with c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d not found: ID does not exist" containerID="c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.862624 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d"} err="failed to get container status \"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d\": rpc error: code = NotFound desc = could not find container \"c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d\": container with ID starting with c5bce896ab37c1d7135885a485414e9ff9be9456be2332b95c07133349a04f8d not found: ID does not exist" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.862656 4830 scope.go:117] "RemoveContainer" containerID="4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5" Mar 18 19:15:46 crc kubenswrapper[4830]: E0318 19:15:46.863162 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5\": container with ID starting with 4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5 not found: ID does not exist" containerID="4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.863387 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5"} err="failed to get container status \"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5\": rpc error: code = NotFound desc = could not find container \"4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5\": container with ID starting with 4eea866ee9af8f6cfa25f0fcc02ca53ea89e7fa038c12d929f04ea88c9e939a5 not found: ID does not exist" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.863485 4830 scope.go:117] "RemoveContainer" containerID="2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035" Mar 18 19:15:46 crc kubenswrapper[4830]: E0318 
19:15:46.863901 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035\": container with ID starting with 2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035 not found: ID does not exist" containerID="2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035" Mar 18 19:15:46 crc kubenswrapper[4830]: I0318 19:15:46.863927 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035"} err="failed to get container status \"2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035\": rpc error: code = NotFound desc = could not find container \"2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035\": container with ID starting with 2b5eb430519cc2a541040512b4b9b41467396e7a5976d0549f8ceb1baff6e035 not found: ID does not exist" Mar 18 19:15:48 crc kubenswrapper[4830]: I0318 19:15:48.245992 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" path="/var/lib/kubelet/pods/f986dda7-facc-4d2a-9d5c-1dceabdeed58/volumes" Mar 18 19:15:57 crc kubenswrapper[4830]: I0318 19:15:57.235442 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:15:57 crc kubenswrapper[4830]: E0318 19:15:57.236162 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.159706 
4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564356-n56vq"] Mar 18 19:16:00 crc kubenswrapper[4830]: E0318 19:16:00.160475 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="registry-server" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.160495 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="registry-server" Mar 18 19:16:00 crc kubenswrapper[4830]: E0318 19:16:00.160530 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="extract-content" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.160542 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="extract-content" Mar 18 19:16:00 crc kubenswrapper[4830]: E0318 19:16:00.160573 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="extract-utilities" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.160587 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="extract-utilities" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.160970 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f986dda7-facc-4d2a-9d5c-1dceabdeed58" containerName="registry-server" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.161719 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.164888 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.165055 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.167868 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.204106 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-n56vq"] Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.262335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzkt\" (UniqueName: \"kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt\") pod \"auto-csr-approver-29564356-n56vq\" (UID: \"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b\") " pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.364237 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzkt\" (UniqueName: \"kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt\") pod \"auto-csr-approver-29564356-n56vq\" (UID: \"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b\") " pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.415360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxzkt\" (UniqueName: \"kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt\") pod \"auto-csr-approver-29564356-n56vq\" (UID: \"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b\") " 
pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.511461 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.788089 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-n56vq"] Mar 18 19:16:00 crc kubenswrapper[4830]: I0318 19:16:00.893940 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-n56vq" event={"ID":"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b","Type":"ContainerStarted","Data":"051abf3819658bb3480d356d86eed6cf48fdc2e2020582dff4c62478703bdb46"} Mar 18 19:16:02 crc kubenswrapper[4830]: I0318 19:16:02.912456 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" containerID="679f40214815ba5fa3a8e61b1b1bf08607b92061f5640f9bc7f2785eeb752a54" exitCode=0 Mar 18 19:16:02 crc kubenswrapper[4830]: I0318 19:16:02.912527 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-n56vq" event={"ID":"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b","Type":"ContainerDied","Data":"679f40214815ba5fa3a8e61b1b1bf08607b92061f5640f9bc7f2785eeb752a54"} Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.271290 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.325377 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxzkt\" (UniqueName: \"kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt\") pod \"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b\" (UID: \"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b\") " Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.333847 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt" (OuterVolumeSpecName: "kube-api-access-cxzkt") pod "d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" (UID: "d6aa1292-dcc9-4ea5-8825-18e0fc478a5b"). InnerVolumeSpecName "kube-api-access-cxzkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.427372 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxzkt\" (UniqueName: \"kubernetes.io/projected/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b-kube-api-access-cxzkt\") on node \"crc\" DevicePath \"\"" Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.959423 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-n56vq" event={"ID":"d6aa1292-dcc9-4ea5-8825-18e0fc478a5b","Type":"ContainerDied","Data":"051abf3819658bb3480d356d86eed6cf48fdc2e2020582dff4c62478703bdb46"} Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.959479 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-n56vq" Mar 18 19:16:04 crc kubenswrapper[4830]: I0318 19:16:04.959508 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051abf3819658bb3480d356d86eed6cf48fdc2e2020582dff4c62478703bdb46" Mar 18 19:16:05 crc kubenswrapper[4830]: I0318 19:16:05.371586 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-ms9hw"] Mar 18 19:16:05 crc kubenswrapper[4830]: I0318 19:16:05.383460 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-ms9hw"] Mar 18 19:16:06 crc kubenswrapper[4830]: I0318 19:16:06.250639 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87c0735-7c95-4e91-acb4-897e6db6909d" path="/var/lib/kubelet/pods/f87c0735-7c95-4e91-acb4-897e6db6909d/volumes" Mar 18 19:16:08 crc kubenswrapper[4830]: I0318 19:16:08.234939 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:16:08 crc kubenswrapper[4830]: E0318 19:16:08.235687 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:16:19 crc kubenswrapper[4830]: I0318 19:16:19.238421 4830 scope.go:117] "RemoveContainer" containerID="12a773d396c894165f8c80547c107752459a02cdd7a174af094337122ee97f81" Mar 18 19:16:23 crc kubenswrapper[4830]: I0318 19:16:23.235633 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:16:23 crc kubenswrapper[4830]: E0318 19:16:23.236372 4830 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:16:37 crc kubenswrapper[4830]: I0318 19:16:37.235708 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:16:37 crc kubenswrapper[4830]: E0318 19:16:37.237007 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:16:49 crc kubenswrapper[4830]: I0318 19:16:49.234908 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:16:49 crc kubenswrapper[4830]: E0318 19:16:49.235903 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:17:02 crc kubenswrapper[4830]: I0318 19:17:02.234569 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:17:02 crc kubenswrapper[4830]: E0318 
19:17:02.235735 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:17:13 crc kubenswrapper[4830]: I0318 19:17:13.234924 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:17:13 crc kubenswrapper[4830]: E0318 19:17:13.235579 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:17:28 crc kubenswrapper[4830]: I0318 19:17:28.235976 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:17:28 crc kubenswrapper[4830]: E0318 19:17:28.237197 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:17:42 crc kubenswrapper[4830]: I0318 19:17:42.236079 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:17:42 crc 
kubenswrapper[4830]: E0318 19:17:42.237358 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:17:53 crc kubenswrapper[4830]: I0318 19:17:53.234821 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:17:53 crc kubenswrapper[4830]: E0318 19:17:53.235931 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.164836 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564358-sttt2"] Mar 18 19:18:00 crc kubenswrapper[4830]: E0318 19:18:00.166111 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.166136 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.166380 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.167194 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.171761 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.172194 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.172612 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.201093 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-sttt2"] Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.291936 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fj4\" (UniqueName: \"kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4\") pod \"auto-csr-approver-29564358-sttt2\" (UID: \"991f9ef1-91ba-4acd-8a7f-6b308ede9334\") " pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.393921 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fj4\" (UniqueName: \"kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4\") pod \"auto-csr-approver-29564358-sttt2\" (UID: \"991f9ef1-91ba-4acd-8a7f-6b308ede9334\") " pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.412860 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fj4\" (UniqueName: \"kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4\") pod \"auto-csr-approver-29564358-sttt2\" (UID: \"991f9ef1-91ba-4acd-8a7f-6b308ede9334\") " 
pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.526239 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:00 crc kubenswrapper[4830]: I0318 19:18:00.986604 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-sttt2"] Mar 18 19:18:01 crc kubenswrapper[4830]: I0318 19:18:01.069396 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-sttt2" event={"ID":"991f9ef1-91ba-4acd-8a7f-6b308ede9334","Type":"ContainerStarted","Data":"0b9cc931781bc3c16881bc4d2138219052189cf8bff721e7eac0ddf548af88ba"} Mar 18 19:18:03 crc kubenswrapper[4830]: I0318 19:18:03.095747 4830 generic.go:334] "Generic (PLEG): container finished" podID="991f9ef1-91ba-4acd-8a7f-6b308ede9334" containerID="1a7ebc514e39463bc7f9063c6d8db38403e70d19f546bd0d275023aef7535119" exitCode=0 Mar 18 19:18:03 crc kubenswrapper[4830]: I0318 19:18:03.095838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-sttt2" event={"ID":"991f9ef1-91ba-4acd-8a7f-6b308ede9334","Type":"ContainerDied","Data":"1a7ebc514e39463bc7f9063c6d8db38403e70d19f546bd0d275023aef7535119"} Mar 18 19:18:04 crc kubenswrapper[4830]: I0318 19:18:04.235489 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:18:04 crc kubenswrapper[4830]: E0318 19:18:04.236577 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" 
Mar 18 19:18:04 crc kubenswrapper[4830]: I0318 19:18:04.473759 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:04 crc kubenswrapper[4830]: I0318 19:18:04.558051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92fj4\" (UniqueName: \"kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4\") pod \"991f9ef1-91ba-4acd-8a7f-6b308ede9334\" (UID: \"991f9ef1-91ba-4acd-8a7f-6b308ede9334\") " Mar 18 19:18:04 crc kubenswrapper[4830]: I0318 19:18:04.568168 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4" (OuterVolumeSpecName: "kube-api-access-92fj4") pod "991f9ef1-91ba-4acd-8a7f-6b308ede9334" (UID: "991f9ef1-91ba-4acd-8a7f-6b308ede9334"). InnerVolumeSpecName "kube-api-access-92fj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:18:04 crc kubenswrapper[4830]: I0318 19:18:04.659751 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92fj4\" (UniqueName: \"kubernetes.io/projected/991f9ef1-91ba-4acd-8a7f-6b308ede9334-kube-api-access-92fj4\") on node \"crc\" DevicePath \"\"" Mar 18 19:18:05 crc kubenswrapper[4830]: I0318 19:18:05.113294 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-sttt2" event={"ID":"991f9ef1-91ba-4acd-8a7f-6b308ede9334","Type":"ContainerDied","Data":"0b9cc931781bc3c16881bc4d2138219052189cf8bff721e7eac0ddf548af88ba"} Mar 18 19:18:05 crc kubenswrapper[4830]: I0318 19:18:05.113610 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b9cc931781bc3c16881bc4d2138219052189cf8bff721e7eac0ddf548af88ba" Mar 18 19:18:05 crc kubenswrapper[4830]: I0318 19:18:05.113396 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-sttt2" Mar 18 19:18:05 crc kubenswrapper[4830]: I0318 19:18:05.570162 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-7vd96"] Mar 18 19:18:05 crc kubenswrapper[4830]: I0318 19:18:05.579307 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-7vd96"] Mar 18 19:18:06 crc kubenswrapper[4830]: I0318 19:18:06.251455 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61019ad2-9549-4b71-bb90-06eaa98298fc" path="/var/lib/kubelet/pods/61019ad2-9549-4b71-bb90-06eaa98298fc/volumes" Mar 18 19:18:16 crc kubenswrapper[4830]: I0318 19:18:16.243569 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:18:16 crc kubenswrapper[4830]: E0318 19:18:16.245202 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:18:19 crc kubenswrapper[4830]: I0318 19:18:19.359556 4830 scope.go:117] "RemoveContainer" containerID="e7e8a6cca51c6788c2f81b0b4e5cf2275f953ec28b83fe3ee4063170fc7c2373" Mar 18 19:18:29 crc kubenswrapper[4830]: I0318 19:18:29.234373 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:18:29 crc kubenswrapper[4830]: E0318 19:18:29.235160 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.522615 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"] Mar 18 19:18:40 crc kubenswrapper[4830]: E0318 19:18:40.523639 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991f9ef1-91ba-4acd-8a7f-6b308ede9334" containerName="oc" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.523663 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="991f9ef1-91ba-4acd-8a7f-6b308ede9334" containerName="oc" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.523977 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="991f9ef1-91ba-4acd-8a7f-6b308ede9334" containerName="oc" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.525638 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.549670 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"] Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.585267 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vtv\" (UniqueName: \"kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.585328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.585363 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.686225 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2vtv\" (UniqueName: \"kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.686288 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.686322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.686798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.686827 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.719754 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2vtv\" (UniqueName: \"kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv\") pod \"redhat-marketplace-49ctn\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:40 crc kubenswrapper[4830]: I0318 19:18:40.862155 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:41 crc kubenswrapper[4830]: I0318 19:18:41.234679 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:18:41 crc kubenswrapper[4830]: E0318 19:18:41.235472 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:18:41 crc kubenswrapper[4830]: I0318 19:18:41.306425 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"] Mar 18 19:18:41 crc kubenswrapper[4830]: I0318 19:18:41.484086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerStarted","Data":"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"} Mar 18 19:18:41 crc kubenswrapper[4830]: I0318 19:18:41.484139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerStarted","Data":"6e88cdaec7dfb172257260d8011391cb04db1afdf195115a0b1122165c308991"} Mar 18 19:18:42 crc kubenswrapper[4830]: I0318 19:18:42.495264 4830 generic.go:334] "Generic (PLEG): container finished" podID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerID="15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6" exitCode=0 Mar 18 19:18:42 crc kubenswrapper[4830]: I0318 19:18:42.495632 4830 generic.go:334] "Generic (PLEG): container finished" podID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" 
containerID="0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712" exitCode=0 Mar 18 19:18:42 crc kubenswrapper[4830]: I0318 19:18:42.495557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerDied","Data":"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"} Mar 18 19:18:42 crc kubenswrapper[4830]: I0318 19:18:42.495678 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerDied","Data":"0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712"} Mar 18 19:18:43 crc kubenswrapper[4830]: I0318 19:18:43.507289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerStarted","Data":"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"} Mar 18 19:18:43 crc kubenswrapper[4830]: I0318 19:18:43.564944 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-49ctn" podStartSLOduration=2.148101013 podStartE2EDuration="3.564912977s" podCreationTimestamp="2026-03-18 19:18:40 +0000 UTC" firstStartedPulling="2026-03-18 19:18:41.487881952 +0000 UTC m=+4556.055512294" lastFinishedPulling="2026-03-18 19:18:42.904693886 +0000 UTC m=+4557.472324258" observedRunningTime="2026-03-18 19:18:43.555177773 +0000 UTC m=+4558.122808145" watchObservedRunningTime="2026-03-18 19:18:43.564912977 +0000 UTC m=+4558.132543349" Mar 18 19:18:50 crc kubenswrapper[4830]: I0318 19:18:50.862752 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:50 crc kubenswrapper[4830]: I0318 19:18:50.863515 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:50 crc kubenswrapper[4830]: I0318 19:18:50.943706 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:51 crc kubenswrapper[4830]: I0318 19:18:51.638129 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:51 crc kubenswrapper[4830]: I0318 19:18:51.696294 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"] Mar 18 19:18:53 crc kubenswrapper[4830]: I0318 19:18:53.595301 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-49ctn" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="registry-server" containerID="cri-o://4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d" gracePeriod=2 Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.118390 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49ctn" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.208410 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities\") pod \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.208525 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2vtv\" (UniqueName: \"kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv\") pod \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.208636 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content\") pod \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\" (UID: \"b502d9f5-44e7-412d-92aa-c4f3b7de6f95\") " Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.209928 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities" (OuterVolumeSpecName: "utilities") pod "b502d9f5-44e7-412d-92aa-c4f3b7de6f95" (UID: "b502d9f5-44e7-412d-92aa-c4f3b7de6f95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.217356 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv" (OuterVolumeSpecName: "kube-api-access-d2vtv") pod "b502d9f5-44e7-412d-92aa-c4f3b7de6f95" (UID: "b502d9f5-44e7-412d-92aa-c4f3b7de6f95"). InnerVolumeSpecName "kube-api-access-d2vtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.243918 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b502d9f5-44e7-412d-92aa-c4f3b7de6f95" (UID: "b502d9f5-44e7-412d-92aa-c4f3b7de6f95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.310604 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.310629 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.310639 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2vtv\" (UniqueName: \"kubernetes.io/projected/b502d9f5-44e7-412d-92aa-c4f3b7de6f95-kube-api-access-d2vtv\") on node \"crc\" DevicePath \"\"" Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.607840 4830 generic.go:334] "Generic (PLEG): container finished" podID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerID="4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d" exitCode=0 Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.607888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerDied","Data":"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"} Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.607918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-49ctn" event={"ID":"b502d9f5-44e7-412d-92aa-c4f3b7de6f95","Type":"ContainerDied","Data":"6e88cdaec7dfb172257260d8011391cb04db1afdf195115a0b1122165c308991"}
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.607958 4830 scope.go:117] "RemoveContainer" containerID="4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.607975 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49ctn"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.641594 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"]
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.649210 4830 scope.go:117] "RemoveContainer" containerID="0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.654020 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-49ctn"]
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.677920 4830 scope.go:117] "RemoveContainer" containerID="15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.713060 4830 scope.go:117] "RemoveContainer" containerID="4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"
Mar 18 19:18:54 crc kubenswrapper[4830]: E0318 19:18:54.713556 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d\": container with ID starting with 4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d not found: ID does not exist" containerID="4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.713640 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d"} err="failed to get container status \"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d\": rpc error: code = NotFound desc = could not find container \"4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d\": container with ID starting with 4e8164d10a04ec0de5087a4b3beb8ac29e9860480f568aabc185b039b01fdd3d not found: ID does not exist"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.713683 4830 scope.go:117] "RemoveContainer" containerID="0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712"
Mar 18 19:18:54 crc kubenswrapper[4830]: E0318 19:18:54.714127 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712\": container with ID starting with 0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712 not found: ID does not exist" containerID="0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.714163 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712"} err="failed to get container status \"0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712\": rpc error: code = NotFound desc = could not find container \"0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712\": container with ID starting with 0ead5a4101c8a17fbeda124d2195ddcd81f92a38974d87ad89f04f048f168712 not found: ID does not exist"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.714185 4830 scope.go:117] "RemoveContainer" containerID="15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"
Mar 18 19:18:54 crc kubenswrapper[4830]: E0318 19:18:54.714439 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6\": container with ID starting with 15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6 not found: ID does not exist" containerID="15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"
Mar 18 19:18:54 crc kubenswrapper[4830]: I0318 19:18:54.714489 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6"} err="failed to get container status \"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6\": rpc error: code = NotFound desc = could not find container \"15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6\": container with ID starting with 15dd7edcd792629c546ef322d3284a694fb804b2b3dd69568934b799d77d9bc6 not found: ID does not exist"
Mar 18 19:18:55 crc kubenswrapper[4830]: I0318 19:18:55.236010 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:18:55 crc kubenswrapper[4830]: E0318 19:18:55.236758 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:18:56 crc kubenswrapper[4830]: I0318 19:18:56.245168 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" path="/var/lib/kubelet/pods/b502d9f5-44e7-412d-92aa-c4f3b7de6f95/volumes"
Mar 18 19:19:08 crc kubenswrapper[4830]: I0318 19:19:08.235147 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:19:08 crc kubenswrapper[4830]: E0318 19:19:08.236278 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:19:23 crc kubenswrapper[4830]: I0318 19:19:23.235334 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:19:23 crc kubenswrapper[4830]: E0318 19:19:23.236434 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:19:35 crc kubenswrapper[4830]: I0318 19:19:35.235185 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:19:35 crc kubenswrapper[4830]: E0318 19:19:35.236137 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:19:46 crc kubenswrapper[4830]: I0318 19:19:46.242605 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:19:46 crc kubenswrapper[4830]: E0318 19:19:46.243754 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:19:57 crc kubenswrapper[4830]: I0318 19:19:57.234837 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:19:57 crc kubenswrapper[4830]: E0318 19:19:57.235518 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.165464 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564360-kxczb"]
Mar 18 19:20:00 crc kubenswrapper[4830]: E0318 19:20:00.167501 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="registry-server"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.167659 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="registry-server"
Mar 18 19:20:00 crc kubenswrapper[4830]: E0318 19:20:00.167895 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="extract-content"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.168082 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="extract-content"
Mar 18 19:20:00 crc kubenswrapper[4830]: E0318 19:20:00.168242 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="extract-utilities"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.168383 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="extract-utilities"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.168818 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b502d9f5-44e7-412d-92aa-c4f3b7de6f95" containerName="registry-server"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.169768 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.173089 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.174199 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.174252 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.177255 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-kxczb"]
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.342990 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcr7\" (UniqueName: \"kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7\") pod \"auto-csr-approver-29564360-kxczb\" (UID: \"b3d2b36a-4163-4218-915b-92c6ec36414d\") " pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.445169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcr7\" (UniqueName: \"kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7\") pod \"auto-csr-approver-29564360-kxczb\" (UID: \"b3d2b36a-4163-4218-915b-92c6ec36414d\") " pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.479885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcr7\" (UniqueName: \"kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7\") pod \"auto-csr-approver-29564360-kxczb\" (UID: \"b3d2b36a-4163-4218-915b-92c6ec36414d\") " pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.510328 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.821557 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-kxczb"]
Mar 18 19:20:00 crc kubenswrapper[4830]: I0318 19:20:00.826923 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 19:20:01 crc kubenswrapper[4830]: I0318 19:20:01.207651 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-kxczb" event={"ID":"b3d2b36a-4163-4218-915b-92c6ec36414d","Type":"ContainerStarted","Data":"0e9cdce7b986e6007e0a1ffe11a4d5532b23199a92974298230ca7af4268dbb6"}
Mar 18 19:20:03 crc kubenswrapper[4830]: I0318 19:20:03.225951 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3d2b36a-4163-4218-915b-92c6ec36414d" containerID="ffb9cea5eeb3bd91132199e378645b5e519aa3b9f41e7af03f50b6cc0f445973" exitCode=0
Mar 18 19:20:03 crc kubenswrapper[4830]: I0318 19:20:03.226057 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-kxczb" event={"ID":"b3d2b36a-4163-4218-915b-92c6ec36414d","Type":"ContainerDied","Data":"ffb9cea5eeb3bd91132199e378645b5e519aa3b9f41e7af03f50b6cc0f445973"}
Mar 18 19:20:04 crc kubenswrapper[4830]: I0318 19:20:04.670381 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:04 crc kubenswrapper[4830]: I0318 19:20:04.818126 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwcr7\" (UniqueName: \"kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7\") pod \"b3d2b36a-4163-4218-915b-92c6ec36414d\" (UID: \"b3d2b36a-4163-4218-915b-92c6ec36414d\") "
Mar 18 19:20:04 crc kubenswrapper[4830]: I0318 19:20:04.825072 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7" (OuterVolumeSpecName: "kube-api-access-fwcr7") pod "b3d2b36a-4163-4218-915b-92c6ec36414d" (UID: "b3d2b36a-4163-4218-915b-92c6ec36414d"). InnerVolumeSpecName "kube-api-access-fwcr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:20:04 crc kubenswrapper[4830]: I0318 19:20:04.920414 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwcr7\" (UniqueName: \"kubernetes.io/projected/b3d2b36a-4163-4218-915b-92c6ec36414d-kube-api-access-fwcr7\") on node \"crc\" DevicePath \"\""
Mar 18 19:20:05 crc kubenswrapper[4830]: I0318 19:20:05.246111 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-kxczb" event={"ID":"b3d2b36a-4163-4218-915b-92c6ec36414d","Type":"ContainerDied","Data":"0e9cdce7b986e6007e0a1ffe11a4d5532b23199a92974298230ca7af4268dbb6"}
Mar 18 19:20:05 crc kubenswrapper[4830]: I0318 19:20:05.246156 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e9cdce7b986e6007e0a1ffe11a4d5532b23199a92974298230ca7af4268dbb6"
Mar 18 19:20:05 crc kubenswrapper[4830]: I0318 19:20:05.246183 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-kxczb"
Mar 18 19:20:05 crc kubenswrapper[4830]: I0318 19:20:05.759215 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-n2fgb"]
Mar 18 19:20:05 crc kubenswrapper[4830]: I0318 19:20:05.767889 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-n2fgb"]
Mar 18 19:20:06 crc kubenswrapper[4830]: I0318 19:20:06.247571 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc2a367-20ae-465b-b520-ee4ad9292563" path="/var/lib/kubelet/pods/7fc2a367-20ae-465b-b520-ee4ad9292563/volumes"
Mar 18 19:20:08 crc kubenswrapper[4830]: I0318 19:20:08.235429 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:20:08 crc kubenswrapper[4830]: E0318 19:20:08.236203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.092339 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:11 crc kubenswrapper[4830]: E0318 19:20:11.093427 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d2b36a-4163-4218-915b-92c6ec36414d" containerName="oc"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.093458 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d2b36a-4163-4218-915b-92c6ec36414d" containerName="oc"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.093807 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d2b36a-4163-4218-915b-92c6ec36414d" containerName="oc"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.095992 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.104888 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.226519 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvq6\" (UniqueName: \"kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.226619 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.226650 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.327575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.327855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.328029 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvq6\" (UniqueName: \"kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.328314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.328741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.365997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvq6\" (UniqueName: \"kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6\") pod \"certified-operators-pk82g\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") " pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.425073 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:11 crc kubenswrapper[4830]: I0318 19:20:11.938224 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:12 crc kubenswrapper[4830]: I0318 19:20:12.306670 4830 generic.go:334] "Generic (PLEG): container finished" podID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerID="8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62" exitCode=0
Mar 18 19:20:12 crc kubenswrapper[4830]: I0318 19:20:12.306820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerDied","Data":"8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62"}
Mar 18 19:20:12 crc kubenswrapper[4830]: I0318 19:20:12.307225 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerStarted","Data":"22654fbf1db372d8208fd105847f811382ddceb74aaaf55417bbdeebb5bca53e"}
Mar 18 19:20:14 crc kubenswrapper[4830]: I0318 19:20:14.330828 4830 generic.go:334] "Generic (PLEG): container finished" podID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerID="fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25" exitCode=0
Mar 18 19:20:14 crc kubenswrapper[4830]: I0318 19:20:14.330893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerDied","Data":"fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25"}
Mar 18 19:20:15 crc kubenswrapper[4830]: I0318 19:20:15.344853 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerStarted","Data":"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"}
Mar 18 19:20:15 crc kubenswrapper[4830]: I0318 19:20:15.383249 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pk82g" podStartSLOduration=1.9275042249999998 podStartE2EDuration="4.383219576s" podCreationTimestamp="2026-03-18 19:20:11 +0000 UTC" firstStartedPulling="2026-03-18 19:20:12.309408587 +0000 UTC m=+4646.877038949" lastFinishedPulling="2026-03-18 19:20:14.765123938 +0000 UTC m=+4649.332754300" observedRunningTime="2026-03-18 19:20:15.370696674 +0000 UTC m=+4649.938327046" watchObservedRunningTime="2026-03-18 19:20:15.383219576 +0000 UTC m=+4649.950849948"
Mar 18 19:20:19 crc kubenswrapper[4830]: I0318 19:20:19.494600 4830 scope.go:117] "RemoveContainer" containerID="251bbc0619075efdaaa56c35f4ba7ceabfd9cb735ee672554d3c98d25b08ae57"
Mar 18 19:20:21 crc kubenswrapper[4830]: I0318 19:20:21.235107 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:20:21 crc kubenswrapper[4830]: E0318 19:20:21.235632 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:20:21 crc kubenswrapper[4830]: I0318 19:20:21.426039 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:21 crc kubenswrapper[4830]: I0318 19:20:21.426627 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:21 crc kubenswrapper[4830]: I0318 19:20:21.507820 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:22 crc kubenswrapper[4830]: I0318 19:20:22.466549 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:22 crc kubenswrapper[4830]: I0318 19:20:22.522878 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.425084 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pk82g" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="registry-server" containerID="cri-o://18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3" gracePeriod=2
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.890215 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.954152 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities\") pod \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") "
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.954222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdvq6\" (UniqueName: \"kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6\") pod \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") "
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.954265 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content\") pod \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\" (UID: \"dcb0c844-d738-4212-9c21-d0d3ae65cb1e\") "
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.956987 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities" (OuterVolumeSpecName: "utilities") pod "dcb0c844-d738-4212-9c21-d0d3ae65cb1e" (UID: "dcb0c844-d738-4212-9c21-d0d3ae65cb1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:20:24 crc kubenswrapper[4830]: I0318 19:20:24.966173 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6" (OuterVolumeSpecName: "kube-api-access-gdvq6") pod "dcb0c844-d738-4212-9c21-d0d3ae65cb1e" (UID: "dcb0c844-d738-4212-9c21-d0d3ae65cb1e"). InnerVolumeSpecName "kube-api-access-gdvq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.056237 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.056285 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdvq6\" (UniqueName: \"kubernetes.io/projected/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-kube-api-access-gdvq6\") on node \"crc\" DevicePath \"\""
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.159849 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcb0c844-d738-4212-9c21-d0d3ae65cb1e" (UID: "dcb0c844-d738-4212-9c21-d0d3ae65cb1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.260634 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb0c844-d738-4212-9c21-d0d3ae65cb1e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.433986 4830 generic.go:334] "Generic (PLEG): container finished" podID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerID="18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3" exitCode=0
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.434038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerDied","Data":"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"}
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.434072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pk82g" event={"ID":"dcb0c844-d738-4212-9c21-d0d3ae65cb1e","Type":"ContainerDied","Data":"22654fbf1db372d8208fd105847f811382ddceb74aaaf55417bbdeebb5bca53e"}
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.434093 4830 scope.go:117] "RemoveContainer" containerID="18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.435073 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pk82g"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.473164 4830 scope.go:117] "RemoveContainer" containerID="fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.487753 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.499109 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pk82g"]
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.530368 4830 scope.go:117] "RemoveContainer" containerID="8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.548613 4830 scope.go:117] "RemoveContainer" containerID="18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"
Mar 18 19:20:25 crc kubenswrapper[4830]: E0318 19:20:25.549086 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3\": container with ID starting with 18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3 not found: ID does not exist" containerID="18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.549145 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3"} err="failed to get container status \"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3\": rpc error: code = NotFound desc = could not find container \"18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3\": container with ID starting with 18417dddf7982757da75de211bfa5a127c1df297eb38f51019fc9e1f46685dc3 not found: ID does not exist"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.549168 4830 scope.go:117] "RemoveContainer" containerID="fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25"
Mar 18 19:20:25 crc kubenswrapper[4830]: E0318 19:20:25.549659 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25\": container with ID starting with fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25 not found: ID does not exist" containerID="fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.549740 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25"} err="failed to get container status \"fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25\": rpc error: code = NotFound desc = could not find container \"fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25\": container with ID starting with fe3c8af7f8f2f7a90603d52d5169fd96e712ad18e8c39526d1f5218deced8e25 not found: ID does not exist"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.549829 4830 scope.go:117] "RemoveContainer" containerID="8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62"
Mar 18 19:20:25 crc kubenswrapper[4830]: E0318 19:20:25.550717 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62\": container with ID starting with 8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62 not found: ID does not exist" containerID="8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62"
Mar 18 19:20:25 crc kubenswrapper[4830]: I0318 19:20:25.550907 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62"} err="failed to get container status \"8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62\": rpc error: code = NotFound desc = could not find container \"8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62\": container with ID starting with 8032f8ec9e2e09a65cc5509d974c130cfea774a3ee2b77f7f0a41eb8cb6fde62 not found: ID does not exist"
Mar 18 19:20:26 crc kubenswrapper[4830]: I0318 19:20:26.269134 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" path="/var/lib/kubelet/pods/dcb0c844-d738-4212-9c21-d0d3ae65cb1e/volumes"
Mar 18 19:20:33 crc kubenswrapper[4830]: I0318 19:20:33.235204 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9"
Mar 18 19:20:33 crc kubenswrapper[4830]: I0318 19:20:33.514509 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972"}
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.600103 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7mvn5"]
Mar 18 19:21:30 crc kubenswrapper[4830]: E0318 19:21:30.605180 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="extract-content"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.605199 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="extract-content"
Mar 18 19:21:30 crc kubenswrapper[4830]: E0318 19:21:30.605219 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="extract-utilities"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.605227 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="extract-utilities"
Mar 18 19:21:30 crc kubenswrapper[4830]: E0318 19:21:30.605258 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="registry-server"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.605266 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="registry-server"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.605436 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb0c844-d738-4212-9c21-d0d3ae65cb1e" containerName="registry-server"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.606651 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mvn5"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.621814 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mvn5"]
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.806697 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.806764 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.806828 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6vfc\" (UniqueName: \"kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.908579 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5"
Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.909160 4830 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.909241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.909659 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.909664 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vfc\" (UniqueName: \"kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.940255 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vfc\" (UniqueName: \"kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc\") pod \"community-operators-7mvn5\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:30 crc kubenswrapper[4830]: I0318 19:21:30.946863 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:31 crc kubenswrapper[4830]: I0318 19:21:31.435267 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mvn5"] Mar 18 19:21:32 crc kubenswrapper[4830]: I0318 19:21:32.053413 4830 generic.go:334] "Generic (PLEG): container finished" podID="42453534-9dd6-4942-a1fe-170491fd30e4" containerID="fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222" exitCode=0 Mar 18 19:21:32 crc kubenswrapper[4830]: I0318 19:21:32.053533 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerDied","Data":"fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222"} Mar 18 19:21:32 crc kubenswrapper[4830]: I0318 19:21:32.055031 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerStarted","Data":"f91d69d834d86acad08d7ef511af4c9fda681dc7d48049b41c66354f19c65213"} Mar 18 19:21:33 crc kubenswrapper[4830]: I0318 19:21:33.065041 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerStarted","Data":"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc"} Mar 18 19:21:34 crc kubenswrapper[4830]: I0318 19:21:34.079611 4830 generic.go:334] "Generic (PLEG): container finished" podID="42453534-9dd6-4942-a1fe-170491fd30e4" containerID="d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc" exitCode=0 Mar 18 19:21:34 crc kubenswrapper[4830]: I0318 19:21:34.079677 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" 
event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerDied","Data":"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc"} Mar 18 19:21:35 crc kubenswrapper[4830]: I0318 19:21:35.095714 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerStarted","Data":"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274"} Mar 18 19:21:35 crc kubenswrapper[4830]: I0318 19:21:35.117795 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7mvn5" podStartSLOduration=2.409113876 podStartE2EDuration="5.117758608s" podCreationTimestamp="2026-03-18 19:21:30 +0000 UTC" firstStartedPulling="2026-03-18 19:21:32.057628354 +0000 UTC m=+4726.625258716" lastFinishedPulling="2026-03-18 19:21:34.766273076 +0000 UTC m=+4729.333903448" observedRunningTime="2026-03-18 19:21:35.113535799 +0000 UTC m=+4729.681166141" watchObservedRunningTime="2026-03-18 19:21:35.117758608 +0000 UTC m=+4729.685388950" Mar 18 19:21:40 crc kubenswrapper[4830]: I0318 19:21:40.948022 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:40 crc kubenswrapper[4830]: I0318 19:21:40.950498 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:41 crc kubenswrapper[4830]: I0318 19:21:41.030933 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:41 crc kubenswrapper[4830]: I0318 19:21:41.232856 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:41 crc kubenswrapper[4830]: I0318 19:21:41.310372 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7mvn5"] Mar 18 19:21:43 crc kubenswrapper[4830]: I0318 19:21:43.172663 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7mvn5" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="registry-server" containerID="cri-o://8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274" gracePeriod=2 Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.117319 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.182581 4830 generic.go:334] "Generic (PLEG): container finished" podID="42453534-9dd6-4942-a1fe-170491fd30e4" containerID="8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274" exitCode=0 Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.182629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerDied","Data":"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274"} Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.182639 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mvn5" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.182659 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mvn5" event={"ID":"42453534-9dd6-4942-a1fe-170491fd30e4","Type":"ContainerDied","Data":"f91d69d834d86acad08d7ef511af4c9fda681dc7d48049b41c66354f19c65213"} Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.182682 4830 scope.go:117] "RemoveContainer" containerID="8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.203304 4830 scope.go:117] "RemoveContainer" containerID="d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.223113 4830 scope.go:117] "RemoveContainer" containerID="fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.228039 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6vfc\" (UniqueName: \"kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc\") pod \"42453534-9dd6-4942-a1fe-170491fd30e4\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.228163 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content\") pod \"42453534-9dd6-4942-a1fe-170491fd30e4\" (UID: \"42453534-9dd6-4942-a1fe-170491fd30e4\") " Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.228241 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities\") pod \"42453534-9dd6-4942-a1fe-170491fd30e4\" (UID: 
\"42453534-9dd6-4942-a1fe-170491fd30e4\") " Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.229127 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities" (OuterVolumeSpecName: "utilities") pod "42453534-9dd6-4942-a1fe-170491fd30e4" (UID: "42453534-9dd6-4942-a1fe-170491fd30e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.235822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc" (OuterVolumeSpecName: "kube-api-access-v6vfc") pod "42453534-9dd6-4942-a1fe-170491fd30e4" (UID: "42453534-9dd6-4942-a1fe-170491fd30e4"). InnerVolumeSpecName "kube-api-access-v6vfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.286444 4830 scope.go:117] "RemoveContainer" containerID="8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274" Mar 18 19:21:44 crc kubenswrapper[4830]: E0318 19:21:44.286824 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274\": container with ID starting with 8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274 not found: ID does not exist" containerID="8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.286855 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274"} err="failed to get container status \"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274\": rpc error: code = NotFound desc = could not find container 
\"8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274\": container with ID starting with 8c14d7d01d66ade62313fc009dda880f6bdb82ddf992f310233f1bcc4b041274 not found: ID does not exist" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.286875 4830 scope.go:117] "RemoveContainer" containerID="d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc" Mar 18 19:21:44 crc kubenswrapper[4830]: E0318 19:21:44.287131 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc\": container with ID starting with d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc not found: ID does not exist" containerID="d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.287153 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc"} err="failed to get container status \"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc\": rpc error: code = NotFound desc = could not find container \"d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc\": container with ID starting with d956f8bb221ef057af8371154e518a6134f8c89d5053603eb34fbec15d5809cc not found: ID does not exist" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.287167 4830 scope.go:117] "RemoveContainer" containerID="fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222" Mar 18 19:21:44 crc kubenswrapper[4830]: E0318 19:21:44.287423 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222\": container with ID starting with fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222 not found: ID does not exist" 
containerID="fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.287443 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222"} err="failed to get container status \"fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222\": rpc error: code = NotFound desc = could not find container \"fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222\": container with ID starting with fc5f0676b8a56c492c992f55881906f92ba533d0392ba3b792d28e8713484222 not found: ID does not exist" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.300426 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42453534-9dd6-4942-a1fe-170491fd30e4" (UID: "42453534-9dd6-4942-a1fe-170491fd30e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.330360 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.330387 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42453534-9dd6-4942-a1fe-170491fd30e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.330396 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6vfc\" (UniqueName: \"kubernetes.io/projected/42453534-9dd6-4942-a1fe-170491fd30e4-kube-api-access-v6vfc\") on node \"crc\" DevicePath \"\"" Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.537973 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mvn5"] Mar 18 19:21:44 crc kubenswrapper[4830]: I0318 19:21:44.550017 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7mvn5"] Mar 18 19:21:46 crc kubenswrapper[4830]: I0318 19:21:46.249158 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" path="/var/lib/kubelet/pods/42453534-9dd6-4942-a1fe-170491fd30e4/volumes" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.164798 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564362-mpc2b"] Mar 18 19:22:00 crc kubenswrapper[4830]: E0318 19:22:00.165979 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="extract-utilities" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.166005 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" 
containerName="extract-utilities" Mar 18 19:22:00 crc kubenswrapper[4830]: E0318 19:22:00.166059 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="registry-server" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.166074 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="registry-server" Mar 18 19:22:00 crc kubenswrapper[4830]: E0318 19:22:00.166097 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="extract-content" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.166111 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="extract-content" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.166475 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="42453534-9dd6-4942-a1fe-170491fd30e4" containerName="registry-server" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.167208 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.171484 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.171662 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.172069 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.180477 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-mpc2b"] Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.321815 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpks\" (UniqueName: \"kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks\") pod \"auto-csr-approver-29564362-mpc2b\" (UID: \"b57ef22a-4cfb-4f81-841c-23851f749849\") " pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.423015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpks\" (UniqueName: \"kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks\") pod \"auto-csr-approver-29564362-mpc2b\" (UID: \"b57ef22a-4cfb-4f81-841c-23851f749849\") " pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.442970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpks\" (UniqueName: \"kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks\") pod \"auto-csr-approver-29564362-mpc2b\" (UID: \"b57ef22a-4cfb-4f81-841c-23851f749849\") " 
pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.502065 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:00 crc kubenswrapper[4830]: I0318 19:22:00.793423 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-mpc2b"] Mar 18 19:22:01 crc kubenswrapper[4830]: I0318 19:22:01.343739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" event={"ID":"b57ef22a-4cfb-4f81-841c-23851f749849","Type":"ContainerStarted","Data":"79f46c7590cf503b0dd778065f022b5fe8b09f1712701edf34c86653a43be81b"} Mar 18 19:22:02 crc kubenswrapper[4830]: I0318 19:22:02.356238 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" event={"ID":"b57ef22a-4cfb-4f81-841c-23851f749849","Type":"ContainerStarted","Data":"43ba8d7b6a349d4d3df028886c0c5146d3f6ac7931e5a6fc910670598d34a940"} Mar 18 19:22:02 crc kubenswrapper[4830]: I0318 19:22:02.383157 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" podStartSLOduration=1.310033865 podStartE2EDuration="2.383129035s" podCreationTimestamp="2026-03-18 19:22:00 +0000 UTC" firstStartedPulling="2026-03-18 19:22:00.802952808 +0000 UTC m=+4755.370583150" lastFinishedPulling="2026-03-18 19:22:01.876047948 +0000 UTC m=+4756.443678320" observedRunningTime="2026-03-18 19:22:02.375538151 +0000 UTC m=+4756.943168493" watchObservedRunningTime="2026-03-18 19:22:02.383129035 +0000 UTC m=+4756.950759377" Mar 18 19:22:03 crc kubenswrapper[4830]: I0318 19:22:03.368083 4830 generic.go:334] "Generic (PLEG): container finished" podID="b57ef22a-4cfb-4f81-841c-23851f749849" containerID="43ba8d7b6a349d4d3df028886c0c5146d3f6ac7931e5a6fc910670598d34a940" exitCode=0 Mar 18 19:22:03 crc 
kubenswrapper[4830]: I0318 19:22:03.368152 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" event={"ID":"b57ef22a-4cfb-4f81-841c-23851f749849","Type":"ContainerDied","Data":"43ba8d7b6a349d4d3df028886c0c5146d3f6ac7931e5a6fc910670598d34a940"} Mar 18 19:22:04 crc kubenswrapper[4830]: I0318 19:22:04.726798 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:04 crc kubenswrapper[4830]: I0318 19:22:04.892692 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpks\" (UniqueName: \"kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks\") pod \"b57ef22a-4cfb-4f81-841c-23851f749849\" (UID: \"b57ef22a-4cfb-4f81-841c-23851f749849\") " Mar 18 19:22:04 crc kubenswrapper[4830]: I0318 19:22:04.898377 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks" (OuterVolumeSpecName: "kube-api-access-9tpks") pod "b57ef22a-4cfb-4f81-841c-23851f749849" (UID: "b57ef22a-4cfb-4f81-841c-23851f749849"). InnerVolumeSpecName "kube-api-access-9tpks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:04 crc kubenswrapper[4830]: I0318 19:22:04.994676 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpks\" (UniqueName: \"kubernetes.io/projected/b57ef22a-4cfb-4f81-841c-23851f749849-kube-api-access-9tpks\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:05 crc kubenswrapper[4830]: I0318 19:22:05.387417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" event={"ID":"b57ef22a-4cfb-4f81-841c-23851f749849","Type":"ContainerDied","Data":"79f46c7590cf503b0dd778065f022b5fe8b09f1712701edf34c86653a43be81b"} Mar 18 19:22:05 crc kubenswrapper[4830]: I0318 19:22:05.387464 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f46c7590cf503b0dd778065f022b5fe8b09f1712701edf34c86653a43be81b" Mar 18 19:22:05 crc kubenswrapper[4830]: I0318 19:22:05.387520 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-mpc2b" Mar 18 19:22:05 crc kubenswrapper[4830]: I0318 19:22:05.480654 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-n56vq"] Mar 18 19:22:05 crc kubenswrapper[4830]: I0318 19:22:05.487048 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-n56vq"] Mar 18 19:22:06 crc kubenswrapper[4830]: I0318 19:22:06.250028 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6aa1292-dcc9-4ea5-8825-18e0fc478a5b" path="/var/lib/kubelet/pods/d6aa1292-dcc9-4ea5-8825-18e0fc478a5b/volumes" Mar 18 19:22:19 crc kubenswrapper[4830]: I0318 19:22:19.644601 4830 scope.go:117] "RemoveContainer" containerID="679f40214815ba5fa3a8e61b1b1bf08607b92061f5640f9bc7f2785eeb752a54" Mar 18 19:22:59 crc kubenswrapper[4830]: I0318 19:22:59.509637 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:22:59 crc kubenswrapper[4830]: I0318 19:22:59.510169 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.256699 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rld4k"] Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.261364 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rld4k"] Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.398746 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-vb9lr"] Mar 18 19:23:04 crc kubenswrapper[4830]: E0318 19:23:04.403363 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57ef22a-4cfb-4f81-841c-23851f749849" containerName="oc" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.403403 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57ef22a-4cfb-4f81-841c-23851f749849" containerName="oc" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.403729 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57ef22a-4cfb-4f81-841c-23851f749849" containerName="oc" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.404492 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.407141 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.407502 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.407788 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.408045 4830 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dvfpb" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.431950 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vb9lr"] Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.501884 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k886w\" (UniqueName: \"kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.501969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.502038 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt\") pod \"crc-storage-crc-vb9lr\" (UID: 
\"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.603032 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.603123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.603191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k886w\" (UniqueName: \"kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.603960 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.604278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.637894 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k886w\" (UniqueName: \"kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w\") pod \"crc-storage-crc-vb9lr\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:04 crc kubenswrapper[4830]: I0318 19:23:04.735061 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:05 crc kubenswrapper[4830]: I0318 19:23:05.255967 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vb9lr"] Mar 18 19:23:05 crc kubenswrapper[4830]: I0318 19:23:05.923935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9lr" event={"ID":"c9ced5a4-cc22-4cf9-bba2-e51d731283e9","Type":"ContainerStarted","Data":"03c9c391c4b1874665bcf525583874a01392e41152d3de7ddc7c90c6b511ade5"} Mar 18 19:23:06 crc kubenswrapper[4830]: I0318 19:23:06.357110 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e2d0f3-46a9-4395-abf0-cfba0ce8803e" path="/var/lib/kubelet/pods/42e2d0f3-46a9-4395-abf0-cfba0ce8803e/volumes" Mar 18 19:23:06 crc kubenswrapper[4830]: I0318 19:23:06.935304 4830 generic.go:334] "Generic (PLEG): container finished" podID="c9ced5a4-cc22-4cf9-bba2-e51d731283e9" containerID="65fa2f44cdf2b50b7675724bf08b7be35806b60643093ea085b9f69536a61c76" exitCode=0 Mar 18 19:23:06 crc kubenswrapper[4830]: I0318 19:23:06.935831 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9lr" event={"ID":"c9ced5a4-cc22-4cf9-bba2-e51d731283e9","Type":"ContainerDied","Data":"65fa2f44cdf2b50b7675724bf08b7be35806b60643093ea085b9f69536a61c76"} Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.302324 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.461502 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage\") pod \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.461631 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k886w\" (UniqueName: \"kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w\") pod \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.461664 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt\") pod \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\" (UID: \"c9ced5a4-cc22-4cf9-bba2-e51d731283e9\") " Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.462032 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c9ced5a4-cc22-4cf9-bba2-e51d731283e9" (UID: "c9ced5a4-cc22-4cf9-bba2-e51d731283e9"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.467198 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w" (OuterVolumeSpecName: "kube-api-access-k886w") pod "c9ced5a4-cc22-4cf9-bba2-e51d731283e9" (UID: "c9ced5a4-cc22-4cf9-bba2-e51d731283e9"). InnerVolumeSpecName "kube-api-access-k886w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.477718 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c9ced5a4-cc22-4cf9-bba2-e51d731283e9" (UID: "c9ced5a4-cc22-4cf9-bba2-e51d731283e9"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.562721 4830 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.562761 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k886w\" (UniqueName: \"kubernetes.io/projected/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-kube-api-access-k886w\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.562792 4830 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c9ced5a4-cc22-4cf9-bba2-e51d731283e9-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.957244 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vb9lr" event={"ID":"c9ced5a4-cc22-4cf9-bba2-e51d731283e9","Type":"ContainerDied","Data":"03c9c391c4b1874665bcf525583874a01392e41152d3de7ddc7c90c6b511ade5"} Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.957305 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03c9c391c4b1874665bcf525583874a01392e41152d3de7ddc7c90c6b511ade5" Mar 18 19:23:08 crc kubenswrapper[4830]: I0318 19:23:08.957315 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vb9lr" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.676338 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-vb9lr"] Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.681007 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-vb9lr"] Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.849205 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tld7h"] Mar 18 19:23:10 crc kubenswrapper[4830]: E0318 19:23:10.849663 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ced5a4-cc22-4cf9-bba2-e51d731283e9" containerName="storage" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.849696 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ced5a4-cc22-4cf9-bba2-e51d731283e9" containerName="storage" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.850011 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ced5a4-cc22-4cf9-bba2-e51d731283e9" containerName="storage" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.850698 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.856881 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.857398 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.857461 4830 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dvfpb" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.857824 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.878398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tld7h"] Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.995438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.995509 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqnr\" (UniqueName: \"kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:10 crc kubenswrapper[4830]: I0318 19:23:10.995543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage\") pod \"crc-storage-crc-tld7h\" (UID: 
\"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.097167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.097303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqnr\" (UniqueName: \"kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.097367 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.098577 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.098953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.127737 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqnr\" (UniqueName: \"kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr\") pod \"crc-storage-crc-tld7h\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.181386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.669834 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tld7h"] Mar 18 19:23:11 crc kubenswrapper[4830]: I0318 19:23:11.983808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tld7h" event={"ID":"449a35f3-a0dc-43b9-bb8b-e89d117b9aef","Type":"ContainerStarted","Data":"7cfa8920b683860e7841ead7cd8e04494d6533d7b2c0fc111e86e7c4659b24bf"} Mar 18 19:23:12 crc kubenswrapper[4830]: I0318 19:23:12.254462 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ced5a4-cc22-4cf9-bba2-e51d731283e9" path="/var/lib/kubelet/pods/c9ced5a4-cc22-4cf9-bba2-e51d731283e9/volumes" Mar 18 19:23:12 crc kubenswrapper[4830]: I0318 19:23:12.999163 4830 generic.go:334] "Generic (PLEG): container finished" podID="449a35f3-a0dc-43b9-bb8b-e89d117b9aef" containerID="5c33f6d8acb93e6588060b34c396435020b073bc5c8631360c9bb81f75d24aaa" exitCode=0 Mar 18 19:23:13 crc kubenswrapper[4830]: I0318 19:23:12.999262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tld7h" event={"ID":"449a35f3-a0dc-43b9-bb8b-e89d117b9aef","Type":"ContainerDied","Data":"5c33f6d8acb93e6588060b34c396435020b073bc5c8631360c9bb81f75d24aaa"} Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.424332 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.549469 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt\") pod \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.549542 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdqnr\" (UniqueName: \"kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr\") pod \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.549609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "449a35f3-a0dc-43b9-bb8b-e89d117b9aef" (UID: "449a35f3-a0dc-43b9-bb8b-e89d117b9aef"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.549755 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage\") pod \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\" (UID: \"449a35f3-a0dc-43b9-bb8b-e89d117b9aef\") " Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.550191 4830 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.558139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr" (OuterVolumeSpecName: "kube-api-access-mdqnr") pod "449a35f3-a0dc-43b9-bb8b-e89d117b9aef" (UID: "449a35f3-a0dc-43b9-bb8b-e89d117b9aef"). InnerVolumeSpecName "kube-api-access-mdqnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.584501 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "449a35f3-a0dc-43b9-bb8b-e89d117b9aef" (UID: "449a35f3-a0dc-43b9-bb8b-e89d117b9aef"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.651602 4830 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:14 crc kubenswrapper[4830]: I0318 19:23:14.651653 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdqnr\" (UniqueName: \"kubernetes.io/projected/449a35f3-a0dc-43b9-bb8b-e89d117b9aef-kube-api-access-mdqnr\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:15 crc kubenswrapper[4830]: I0318 19:23:15.022845 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tld7h" event={"ID":"449a35f3-a0dc-43b9-bb8b-e89d117b9aef","Type":"ContainerDied","Data":"7cfa8920b683860e7841ead7cd8e04494d6533d7b2c0fc111e86e7c4659b24bf"} Mar 18 19:23:15 crc kubenswrapper[4830]: I0318 19:23:15.022903 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cfa8920b683860e7841ead7cd8e04494d6533d7b2c0fc111e86e7c4659b24bf" Mar 18 19:23:15 crc kubenswrapper[4830]: I0318 19:23:15.022902 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tld7h" Mar 18 19:23:19 crc kubenswrapper[4830]: I0318 19:23:19.750733 4830 scope.go:117] "RemoveContainer" containerID="c79a38fa06ab08731da9f5b76c8877107b5bd2828136bc7e5f348e91feb6593f" Mar 18 19:23:29 crc kubenswrapper[4830]: I0318 19:23:29.510272 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:23:29 crc kubenswrapper[4830]: I0318 19:23:29.511111 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:23:59 crc kubenswrapper[4830]: I0318 19:23:59.510171 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:23:59 crc kubenswrapper[4830]: I0318 19:23:59.510984 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:23:59 crc kubenswrapper[4830]: I0318 19:23:59.511046 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:23:59 crc kubenswrapper[4830]: I0318 
19:23:59.511764 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:23:59 crc kubenswrapper[4830]: I0318 19:23:59.511862 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972" gracePeriod=600 Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.170395 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564364-hxvwf"] Mar 18 19:24:00 crc kubenswrapper[4830]: E0318 19:24:00.171206 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449a35f3-a0dc-43b9-bb8b-e89d117b9aef" containerName="storage" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.171232 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="449a35f3-a0dc-43b9-bb8b-e89d117b9aef" containerName="storage" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.171490 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="449a35f3-a0dc-43b9-bb8b-e89d117b9aef" containerName="storage" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.172225 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.175404 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.175901 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.177234 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.186856 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-hxvwf"] Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.206192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86pmd\" (UniqueName: \"kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd\") pod \"auto-csr-approver-29564364-hxvwf\" (UID: \"4d23f38d-d210-46b2-9f30-b80e980f274c\") " pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.308026 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86pmd\" (UniqueName: \"kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd\") pod \"auto-csr-approver-29564364-hxvwf\" (UID: \"4d23f38d-d210-46b2-9f30-b80e980f274c\") " pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.339349 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86pmd\" (UniqueName: \"kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd\") pod \"auto-csr-approver-29564364-hxvwf\" (UID: \"4d23f38d-d210-46b2-9f30-b80e980f274c\") " 
pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.475125 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972" exitCode=0 Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.475176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972"} Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.475207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"} Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.475228 4830 scope.go:117] "RemoveContainer" containerID="dd00990d3e0490eb7bb214bb6b0dc05f42337becb8fc6e26428974096f6651c9" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.537354 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:00 crc kubenswrapper[4830]: I0318 19:24:00.844008 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-hxvwf"] Mar 18 19:24:01 crc kubenswrapper[4830]: I0318 19:24:01.485541 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" event={"ID":"4d23f38d-d210-46b2-9f30-b80e980f274c","Type":"ContainerStarted","Data":"a0b86bc7704dc73a7e5c4eb5677969fa0716bc0496b822e7a1049ddc61e0fb6b"} Mar 18 19:24:02 crc kubenswrapper[4830]: I0318 19:24:02.500000 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" event={"ID":"4d23f38d-d210-46b2-9f30-b80e980f274c","Type":"ContainerStarted","Data":"0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b"} Mar 18 19:24:02 crc kubenswrapper[4830]: I0318 19:24:02.520341 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" podStartSLOduration=1.461513364 podStartE2EDuration="2.520322354s" podCreationTimestamp="2026-03-18 19:24:00 +0000 UTC" firstStartedPulling="2026-03-18 19:24:00.858102721 +0000 UTC m=+4875.425733093" lastFinishedPulling="2026-03-18 19:24:01.916911721 +0000 UTC m=+4876.484542083" observedRunningTime="2026-03-18 19:24:02.51349503 +0000 UTC m=+4877.081125372" watchObservedRunningTime="2026-03-18 19:24:02.520322354 +0000 UTC m=+4877.087952696" Mar 18 19:24:02 crc kubenswrapper[4830]: E0318 19:24:02.629858 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d23f38d_d210_46b2_9f30_b80e980f274c.slice/crio-conmon-0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d23f38d_d210_46b2_9f30_b80e980f274c.slice/crio-0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b.scope\": RecentStats: unable to find data in memory cache]" Mar 18 19:24:03 crc kubenswrapper[4830]: I0318 19:24:03.510170 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d23f38d-d210-46b2-9f30-b80e980f274c" containerID="0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b" exitCode=0 Mar 18 19:24:03 crc kubenswrapper[4830]: I0318 19:24:03.510233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" event={"ID":"4d23f38d-d210-46b2-9f30-b80e980f274c","Type":"ContainerDied","Data":"0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b"} Mar 18 19:24:04 crc kubenswrapper[4830]: I0318 19:24:04.809305 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:04 crc kubenswrapper[4830]: I0318 19:24:04.882534 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86pmd\" (UniqueName: \"kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd\") pod \"4d23f38d-d210-46b2-9f30-b80e980f274c\" (UID: \"4d23f38d-d210-46b2-9f30-b80e980f274c\") " Mar 18 19:24:04 crc kubenswrapper[4830]: I0318 19:24:04.889239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd" (OuterVolumeSpecName: "kube-api-access-86pmd") pod "4d23f38d-d210-46b2-9f30-b80e980f274c" (UID: "4d23f38d-d210-46b2-9f30-b80e980f274c"). InnerVolumeSpecName "kube-api-access-86pmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:24:04 crc kubenswrapper[4830]: I0318 19:24:04.984973 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86pmd\" (UniqueName: \"kubernetes.io/projected/4d23f38d-d210-46b2-9f30-b80e980f274c-kube-api-access-86pmd\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:05 crc kubenswrapper[4830]: I0318 19:24:05.528284 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" event={"ID":"4d23f38d-d210-46b2-9f30-b80e980f274c","Type":"ContainerDied","Data":"a0b86bc7704dc73a7e5c4eb5677969fa0716bc0496b822e7a1049ddc61e0fb6b"} Mar 18 19:24:05 crc kubenswrapper[4830]: I0318 19:24:05.528332 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-hxvwf" Mar 18 19:24:05 crc kubenswrapper[4830]: I0318 19:24:05.528342 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b86bc7704dc73a7e5c4eb5677969fa0716bc0496b822e7a1049ddc61e0fb6b" Mar 18 19:24:05 crc kubenswrapper[4830]: I0318 19:24:05.606096 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-sttt2"] Mar 18 19:24:05 crc kubenswrapper[4830]: I0318 19:24:05.616218 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-sttt2"] Mar 18 19:24:06 crc kubenswrapper[4830]: I0318 19:24:06.244723 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991f9ef1-91ba-4acd-8a7f-6b308ede9334" path="/var/lib/kubelet/pods/991f9ef1-91ba-4acd-8a7f-6b308ede9334/volumes" Mar 18 19:24:19 crc kubenswrapper[4830]: I0318 19:24:19.827990 4830 scope.go:117] "RemoveContainer" containerID="1a7ebc514e39463bc7f9063c6d8db38403e70d19f546bd0d275023aef7535119" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.937034 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:16 crc kubenswrapper[4830]: E0318 19:25:16.937712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23f38d-d210-46b2-9f30-b80e980f274c" containerName="oc" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.937726 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23f38d-d210-46b2-9f30-b80e980f274c" containerName="oc" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.937872 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23f38d-d210-46b2-9f30-b80e980f274c" containerName="oc" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.938502 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.951367 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.951661 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.951733 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sbwnq" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.951803 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.958123 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.966413 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.967504 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:16 crc kubenswrapper[4830]: I0318 19:25:16.970294 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.029633 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.092559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.092624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdc4\" (UniqueName: \"kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.092647 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndlr\" (UniqueName: \"kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.092720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " 
pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.092742 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.194383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.194452 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdc4\" (UniqueName: \"kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.194473 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndlr\" (UniqueName: \"kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.194510 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 
19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.194524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.195491 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.195743 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.196107 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.212788 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdc4\" (UniqueName: \"kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4\") pod \"dnsmasq-dns-76f4889f87-spp2z\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.214814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lndlr\" (UniqueName: \"kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr\") pod \"dnsmasq-dns-78dcc4d9b5-p9j84\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.257647 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.283189 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.411006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.447777 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.449007 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.467705 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.601754 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.601897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.601926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntbt\" (UniqueName: \"kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.712005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.712205 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.712250 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntbt\" (UniqueName: \"kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.713649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.717510 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.779931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lntbt\" (UniqueName: \"kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt\") pod \"dnsmasq-dns-6cfbf56dd9-5h5mm\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.856485 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.887148 4830 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.911695 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.913250 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.925707 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"] Mar 18 19:25:17 crc kubenswrapper[4830]: I0318 19:25:17.953042 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:17 crc kubenswrapper[4830]: W0318 19:25:17.953476 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7d4ea7b_2f9e_406f_8eea_f4e906bdd07d.slice/crio-8dcfd2cce2c68bcfa48950a2b5371fd3ce91d76a8195cb1480cfc70ede77016b WatchSource:0}: Error finding container 8dcfd2cce2c68bcfa48950a2b5371fd3ce91d76a8195cb1480cfc70ede77016b: Status 404 returned error can't find the container with id 8dcfd2cce2c68bcfa48950a2b5371fd3ce91d76a8195cb1480cfc70ede77016b Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.014630 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.015363 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.015426 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qsrr2\" (UniqueName: \"kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.015450 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: W0318 19:25:18.033544 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba6b8094_e9cb_4335_8179_9e27c28a810d.slice/crio-8ebbbbbe05849fafe576bdecc79f5d385808b333b249fe7717a1c51160bb183a WatchSource:0}: Error finding container 8ebbbbbe05849fafe576bdecc79f5d385808b333b249fe7717a1c51160bb183a: Status 404 returned error can't find the container with id 8ebbbbbe05849fafe576bdecc79f5d385808b333b249fe7717a1c51160bb183a Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.116542 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.116603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrr2\" (UniqueName: \"kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 
19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.116625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.117327 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.117351 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.136003 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrr2\" (UniqueName: \"kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2\") pod \"dnsmasq-dns-7c95686bd5-fqzh7\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") " pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.239482 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.331420 4830 generic.go:334] "Generic (PLEG): container finished" podID="ba6b8094-e9cb-4335-8179-9e27c28a810d" containerID="c887620735fd206d148a885906aea53ba8d12bbeb26e8c65c9734ac55ac3a3ba" exitCode=0 Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.331471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" event={"ID":"ba6b8094-e9cb-4335-8179-9e27c28a810d","Type":"ContainerDied","Data":"c887620735fd206d148a885906aea53ba8d12bbeb26e8c65c9734ac55ac3a3ba"} Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.331528 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" event={"ID":"ba6b8094-e9cb-4335-8179-9e27c28a810d","Type":"ContainerStarted","Data":"8ebbbbbe05849fafe576bdecc79f5d385808b333b249fe7717a1c51160bb183a"} Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.338137 4830 generic.go:334] "Generic (PLEG): container finished" podID="d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" containerID="e1d72a55b5f975564fc69b07642c45caf66a3f8cda942e8ae6c77ffedada8df6" exitCode=0 Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.338182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" event={"ID":"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d","Type":"ContainerDied","Data":"e1d72a55b5f975564fc69b07642c45caf66a3f8cda942e8ae6c77ffedada8df6"} Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.338208 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" event={"ID":"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d","Type":"ContainerStarted","Data":"8dcfd2cce2c68bcfa48950a2b5371fd3ce91d76a8195cb1480cfc70ede77016b"} Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.412899 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] 
Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.634961 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.636567 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.642349 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.642545 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.642675 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.642842 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.643276 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.643518 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.644471 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nffzv" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.648234 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.659470 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.726729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndlr\" (UniqueName: \"kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr\") pod \"ba6b8094-e9cb-4335-8179-9e27c28a810d\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.726851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config\") pod \"ba6b8094-e9cb-4335-8179-9e27c28a810d\" (UID: \"ba6b8094-e9cb-4335-8179-9e27c28a810d\") " Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727163 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727181 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727198 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpgh\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727267 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.727322 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.728247 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:18 crc kubenswrapper[4830]: W0318 19:25:18.732468 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1015e46e_5610_4002_b19f_52b3489eb469.slice/crio-63db4b24ff79c84497af31a0ac398c9e8d87e7f619ae7a301160a80a2b7eb37d WatchSource:0}: Error finding container 63db4b24ff79c84497af31a0ac398c9e8d87e7f619ae7a301160a80a2b7eb37d: Status 404 returned error can't find the container with id 63db4b24ff79c84497af31a0ac398c9e8d87e7f619ae7a301160a80a2b7eb37d Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.733239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr" (OuterVolumeSpecName: "kube-api-access-lndlr") pod "ba6b8094-e9cb-4335-8179-9e27c28a810d" (UID: "ba6b8094-e9cb-4335-8179-9e27c28a810d"). InnerVolumeSpecName "kube-api-access-lndlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.735386 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"] Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.746211 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config" (OuterVolumeSpecName: "config") pod "ba6b8094-e9cb-4335-8179-9e27c28a810d" (UID: "ba6b8094-e9cb-4335-8179-9e27c28a810d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.827945 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config\") pod \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828008 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc\") pod \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828107 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdc4\" (UniqueName: \"kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4\") pod \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\" (UID: \"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d\") " Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828431 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828455 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828481 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828652 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828695 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpgh\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828726 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828750 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828822 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndlr\" (UniqueName: \"kubernetes.io/projected/ba6b8094-e9cb-4335-8179-9e27c28a810d-kube-api-access-lndlr\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.828842 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6b8094-e9cb-4335-8179-9e27c28a810d-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.829671 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.829765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.830090 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.830288 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.830958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.840079 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4" (OuterVolumeSpecName: 
"kube-api-access-6jdc4") pod "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" (UID: "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d"). InnerVolumeSpecName "kube-api-access-6jdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.840681 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.840741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.840742 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.840968 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.843855 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.843917 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fce16860e1b71769ef6ee1bb5fe8ee71a0a195bdb1f0e797608caa82b04c1475/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.848821 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpgh\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.861365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config" (OuterVolumeSpecName: "config") pod "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" (UID: "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.862240 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" (UID: "d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.882585 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.930959 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdc4\" (UniqueName: \"kubernetes.io/projected/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-kube-api-access-6jdc4\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.931012 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.931034 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.984417 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:25:18 crc kubenswrapper[4830]: E0318 19:25:18.984747 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" containerName="init" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.984764 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" containerName="init" Mar 18 19:25:18 crc kubenswrapper[4830]: E0318 19:25:18.984802 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6b8094-e9cb-4335-8179-9e27c28a810d" containerName="init" Mar 18 19:25:18 crc 
kubenswrapper[4830]: I0318 19:25:18.984812 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6b8094-e9cb-4335-8179-9e27c28a810d" containerName="init" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.984948 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" containerName="init" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.985262 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6b8094-e9cb-4335-8179-9e27c28a810d" containerName="init" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.985956 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.990250 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.990533 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.990757 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j7w4c" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.990896 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.991042 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.991162 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.992032 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 19:25:18 crc kubenswrapper[4830]: I0318 19:25:18.993000 4830 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.015712 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.133721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134250 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134355 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134425 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134451 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134480 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134503 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134527 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134548 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.134579 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pnf\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.235862 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pnf\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.235925 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.235948 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.235983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236065 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236087 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236103 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc 
kubenswrapper[4830]: I0318 19:25:19.236119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.236764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.237286 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.237378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.237885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.237979 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.239609 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.240032 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.246276 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.246570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc 
kubenswrapper[4830]: I0318 19:25:19.246796 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.246850 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/295cf756ac76f5e0c3f6c5c915148602110b4493b539d2e8b27ffd4382606983/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.252886 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pnf\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.269133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") " pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.312637 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.345554 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" event={"ID":"d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d","Type":"ContainerDied","Data":"8dcfd2cce2c68bcfa48950a2b5371fd3ce91d76a8195cb1480cfc70ede77016b"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.345603 4830 scope.go:117] "RemoveContainer" containerID="e1d72a55b5f975564fc69b07642c45caf66a3f8cda942e8ae6c77ffedada8df6" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.345737 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-spp2z" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.349096 4830 generic.go:334] "Generic (PLEG): container finished" podID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerID="c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314" exitCode=0 Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.349160 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" event={"ID":"14df27a3-8516-4fc0-aef9-6f0c852697bf","Type":"ContainerDied","Data":"c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.349189 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" event={"ID":"14df27a3-8516-4fc0-aef9-6f0c852697bf","Type":"ContainerStarted","Data":"e6df852424e968436d6e6a47395e417156ce731424a150e6d48f746282f71d2c"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.360657 4830 generic.go:334] "Generic (PLEG): container finished" podID="1015e46e-5610-4002-b19f-52b3489eb469" containerID="58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845" exitCode=0 Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.360725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" event={"ID":"1015e46e-5610-4002-b19f-52b3489eb469","Type":"ContainerDied","Data":"58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.360752 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" event={"ID":"1015e46e-5610-4002-b19f-52b3489eb469","Type":"ContainerStarted","Data":"63db4b24ff79c84497af31a0ac398c9e8d87e7f619ae7a301160a80a2b7eb37d"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.362988 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" event={"ID":"ba6b8094-e9cb-4335-8179-9e27c28a810d","Type":"ContainerDied","Data":"8ebbbbbe05849fafe576bdecc79f5d385808b333b249fe7717a1c51160bb183a"} Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.363055 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-p9j84" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.441388 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:25:19 crc kubenswrapper[4830]: W0318 19:25:19.446258 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec6dc5f_d924_482c_b2a3_b5dd8ce95416.slice/crio-26ca6fb9338a0f6112c7be169117c520ec4e21372e4729e7741ffade01bff404 WatchSource:0}: Error finding container 26ca6fb9338a0f6112c7be169117c520ec4e21372e4729e7741ffade01bff404: Status 404 returned error can't find the container with id 26ca6fb9338a0f6112c7be169117c520ec4e21372e4729e7741ffade01bff404 Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.455927 4830 scope.go:117] "RemoveContainer" containerID="c887620735fd206d148a885906aea53ba8d12bbeb26e8c65c9734ac55ac3a3ba" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.485972 4830 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.492857 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-spp2z"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.514516 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.532077 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-p9j84"] Mar 18 19:25:19 crc kubenswrapper[4830]: E0318 19:25:19.600688 4830 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 18 19:25:19 crc kubenswrapper[4830]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/14df27a3-8516-4fc0-aef9-6f0c852697bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 19:25:19 crc kubenswrapper[4830]: > podSandboxID="e6df852424e968436d6e6a47395e417156ce731424a150e6d48f746282f71d2c" Mar 18 19:25:19 crc kubenswrapper[4830]: E0318 19:25:19.600948 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 19:25:19 crc kubenswrapper[4830]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6hc5h68h68h594h659hdbh679h65ch5f6hdch6h5b9h8fh55hfhf8h57fhc7h56ch687h669h559h678h5dhc7hf7h697h5d6h9ch669h54fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lntbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6cfbf56dd9-5h5mm_openstack(14df27a3-8516-4fc0-aef9-6f0c852697bf): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/14df27a3-8516-4fc0-aef9-6f0c852697bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 19:25:19 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 18 19:25:19 crc kubenswrapper[4830]: E0318 19:25:19.602123 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/14df27a3-8516-4fc0-aef9-6f0c852697bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.669294 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.670387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.675686 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.675985 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.676155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xgwhh" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.676325 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.682057 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.682635 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.745672 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kolla-config\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.745714 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-default\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.745904 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.745956 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmg4h\" (UniqueName: \"kubernetes.io/projected/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kube-api-access-dmg4h\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.746071 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.746120 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.746154 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.746180 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.774578 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmg4h\" (UniqueName: \"kubernetes.io/projected/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kube-api-access-dmg4h\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847165 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847222 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" 
Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847242 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847299 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kolla-config\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-default\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.847342 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.849307 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.851605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.852113 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kolla-config\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.852355 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.852503 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f6fa82-cef7-4a49-a6ac-053e904d5142-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.852702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b9f6fa82-cef7-4a49-a6ac-053e904d5142-config-data-default\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.855001 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.855036 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/09abcf4e3e4be895ae418e490813c594b019612b227d554562aadb5708ddd0e8/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.870154 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmg4h\" (UniqueName: \"kubernetes.io/projected/b9f6fa82-cef7-4a49-a6ac-053e904d5142-kube-api-access-dmg4h\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:19 crc kubenswrapper[4830]: I0318 19:25:19.889236 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064a0ffa-084b-44a5-b84d-96f5d3ce7d14\") pod \"openstack-galera-0\" (UID: \"b9f6fa82-cef7-4a49-a6ac-053e904d5142\") " pod="openstack/openstack-galera-0" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.014057 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.243852 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6b8094-e9cb-4335-8179-9e27c28a810d" path="/var/lib/kubelet/pods/ba6b8094-e9cb-4335-8179-9e27c28a810d/volumes" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.245390 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d" path="/var/lib/kubelet/pods/d7d4ea7b-2f9e-406f-8eea-f4e906bdd07d/volumes" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.372828 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerStarted","Data":"26ca6fb9338a0f6112c7be169117c520ec4e21372e4729e7741ffade01bff404"} Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.373484 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerStarted","Data":"c6b5943e57c0336bb4058e28c1bec41b744583dbc0a7c8c58eb5aa9888fe2b07"} Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.376665 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" event={"ID":"1015e46e-5610-4002-b19f-52b3489eb469","Type":"ContainerStarted","Data":"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"} Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.376755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.426994 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" podStartSLOduration=3.426977705 podStartE2EDuration="3.426977705s" podCreationTimestamp="2026-03-18 19:25:17 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:20.423377542 +0000 UTC m=+4954.991007894" watchObservedRunningTime="2026-03-18 19:25:20.426977705 +0000 UTC m=+4954.994608037" Mar 18 19:25:20 crc kubenswrapper[4830]: I0318 19:25:20.522566 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:25:20 crc kubenswrapper[4830]: W0318 19:25:20.580403 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9f6fa82_cef7_4a49_a6ac_053e904d5142.slice/crio-e4231904f695d6851be8de237860ac6c795ccc4a3fcaa164dacb20ec8f80d599 WatchSource:0}: Error finding container e4231904f695d6851be8de237860ac6c795ccc4a3fcaa164dacb20ec8f80d599: Status 404 returned error can't find the container with id e4231904f695d6851be8de237860ac6c795ccc4a3fcaa164dacb20ec8f80d599 Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.324145 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.331085 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.333914 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.333997 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.335357 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.335390 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pd4fk" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.357498 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.369961 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370103 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wsp\" (UniqueName: \"kubernetes.io/projected/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kube-api-access-s2wsp\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.370429 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.384087 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerStarted","Data":"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"} Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.388431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" event={"ID":"14df27a3-8516-4fc0-aef9-6f0c852697bf","Type":"ContainerStarted","Data":"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d"} Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.388649 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.390465 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b9f6fa82-cef7-4a49-a6ac-053e904d5142","Type":"ContainerStarted","Data":"51b5b1d67481c580655c26d6a9c298f8e5803503b1a5619fb0aa40670530c6f1"} Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.390502 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b9f6fa82-cef7-4a49-a6ac-053e904d5142","Type":"ContainerStarted","Data":"e4231904f695d6851be8de237860ac6c795ccc4a3fcaa164dacb20ec8f80d599"} Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.392348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerStarted","Data":"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"} Mar 18 19:25:21 crc kubenswrapper[4830]: 
I0318 19:25:21.430234 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" podStartSLOduration=4.430215587 podStartE2EDuration="4.430215587s" podCreationTimestamp="2026-03-18 19:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:21.424829794 +0000 UTC m=+4955.992460136" watchObservedRunningTime="2026-03-18 19:25:21.430215587 +0000 UTC m=+4955.997845919" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.473221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.473296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.473778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wsp\" (UniqueName: \"kubernetes.io/projected/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kube-api-access-s2wsp\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474425 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\") pod \"openstack-cell1-galera-0\" (UID: 
\"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474080 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.474988 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.477283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.480293 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.492458 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc 
kubenswrapper[4830]: I0318 19:25:21.493748 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.493778 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0b76f01063436710ba6301a23dd2f876e60961144cd17180c6f94f1d66ee89e0/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.500901 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wsp\" (UniqueName: \"kubernetes.io/projected/06e64bc2-e23b-4b88-8e5e-87d979fd10f3-kube-api-access-s2wsp\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.521140 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52a5f6b7-d8fe-48ac-bf59-4d0eaa1e8b36\") pod \"openstack-cell1-galera-0\" (UID: \"06e64bc2-e23b-4b88-8e5e-87d979fd10f3\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.627567 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.628497 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.630751 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mzfdw" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.630815 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.636242 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.637363 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.658135 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.780088 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.780147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trnxk\" (UniqueName: \"kubernetes.io/projected/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kube-api-access-trnxk\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.780176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kolla-config\") pod \"memcached-0\" (UID: 
\"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.780292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.780319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-config-data\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.881717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.882050 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-config-data\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.882082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.882106 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-trnxk\" (UniqueName: \"kubernetes.io/projected/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kube-api-access-trnxk\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.882129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kolla-config\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.882981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kolla-config\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.883788 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-config-data\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.887034 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.887165 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " 
pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.901072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trnxk\" (UniqueName: \"kubernetes.io/projected/ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d-kube-api-access-trnxk\") pod \"memcached-0\" (UID: \"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d\") " pod="openstack/memcached-0" Mar 18 19:25:21 crc kubenswrapper[4830]: I0318 19:25:21.951819 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 19:25:22 crc kubenswrapper[4830]: W0318 19:25:22.085729 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e64bc2_e23b_4b88_8e5e_87d979fd10f3.slice/crio-8a32c729d26c80c9d8e0b1ef3a7cf4b37528683ef12c4e0486d15960f0f876cb WatchSource:0}: Error finding container 8a32c729d26c80c9d8e0b1ef3a7cf4b37528683ef12c4e0486d15960f0f876cb: Status 404 returned error can't find the container with id 8a32c729d26c80c9d8e0b1ef3a7cf4b37528683ef12c4e0486d15960f0f876cb Mar 18 19:25:22 crc kubenswrapper[4830]: I0318 19:25:22.091539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:25:22 crc kubenswrapper[4830]: I0318 19:25:22.392467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 19:25:22 crc kubenswrapper[4830]: W0318 19:25:22.393539 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae6d3440_d0f8_4bf4_a1cd_0a9cf63bc92d.slice/crio-7593eaa4603eb8ba9762f6a2ff016fb35736ea6d76ccd3b7d4e0bbffaeffcb75 WatchSource:0}: Error finding container 7593eaa4603eb8ba9762f6a2ff016fb35736ea6d76ccd3b7d4e0bbffaeffcb75: Status 404 returned error can't find the container with id 7593eaa4603eb8ba9762f6a2ff016fb35736ea6d76ccd3b7d4e0bbffaeffcb75 Mar 18 19:25:22 crc kubenswrapper[4830]: I0318 
19:25:22.407488 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06e64bc2-e23b-4b88-8e5e-87d979fd10f3","Type":"ContainerStarted","Data":"4c03585403c99ce160cdd00cd2dded961182e6724203883c9be97b91749ae20c"} Mar 18 19:25:22 crc kubenswrapper[4830]: I0318 19:25:22.407543 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06e64bc2-e23b-4b88-8e5e-87d979fd10f3","Type":"ContainerStarted","Data":"8a32c729d26c80c9d8e0b1ef3a7cf4b37528683ef12c4e0486d15960f0f876cb"} Mar 18 19:25:23 crc kubenswrapper[4830]: I0318 19:25:23.416832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d","Type":"ContainerStarted","Data":"0da5a27f40474f8aa283ef68279714a6e6bbc770b6312705d2bcb6a214e52f52"} Mar 18 19:25:23 crc kubenswrapper[4830]: I0318 19:25:23.417182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d","Type":"ContainerStarted","Data":"7593eaa4603eb8ba9762f6a2ff016fb35736ea6d76ccd3b7d4e0bbffaeffcb75"} Mar 18 19:25:23 crc kubenswrapper[4830]: I0318 19:25:23.443516 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.4434976649999998 podStartE2EDuration="2.443497665s" podCreationTimestamp="2026-03-18 19:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:23.435708874 +0000 UTC m=+4958.003339216" watchObservedRunningTime="2026-03-18 19:25:23.443497665 +0000 UTC m=+4958.011128017" Mar 18 19:25:24 crc kubenswrapper[4830]: I0318 19:25:24.426731 4830 generic.go:334] "Generic (PLEG): container finished" podID="b9f6fa82-cef7-4a49-a6ac-053e904d5142" containerID="51b5b1d67481c580655c26d6a9c298f8e5803503b1a5619fb0aa40670530c6f1" exitCode=0 Mar 18 
19:25:24 crc kubenswrapper[4830]: I0318 19:25:24.426826 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b9f6fa82-cef7-4a49-a6ac-053e904d5142","Type":"ContainerDied","Data":"51b5b1d67481c580655c26d6a9c298f8e5803503b1a5619fb0aa40670530c6f1"} Mar 18 19:25:24 crc kubenswrapper[4830]: I0318 19:25:24.427364 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 19:25:25 crc kubenswrapper[4830]: I0318 19:25:25.438462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b9f6fa82-cef7-4a49-a6ac-053e904d5142","Type":"ContainerStarted","Data":"f141d59d0c28f2ed74bd4387515202931f06cdd57cda771ff3484a9a70e4ac5d"} Mar 18 19:25:25 crc kubenswrapper[4830]: I0318 19:25:25.467564 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.467542488 podStartE2EDuration="7.467542488s" podCreationTimestamp="2026-03-18 19:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:25.460311782 +0000 UTC m=+4960.027942154" watchObservedRunningTime="2026-03-18 19:25:25.467542488 +0000 UTC m=+4960.035172830" Mar 18 19:25:26 crc kubenswrapper[4830]: I0318 19:25:26.450618 4830 generic.go:334] "Generic (PLEG): container finished" podID="06e64bc2-e23b-4b88-8e5e-87d979fd10f3" containerID="4c03585403c99ce160cdd00cd2dded961182e6724203883c9be97b91749ae20c" exitCode=0 Mar 18 19:25:26 crc kubenswrapper[4830]: I0318 19:25:26.450812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06e64bc2-e23b-4b88-8e5e-87d979fd10f3","Type":"ContainerDied","Data":"4c03585403c99ce160cdd00cd2dded961182e6724203883c9be97b91749ae20c"} Mar 18 19:25:27 crc kubenswrapper[4830]: I0318 19:25:27.464890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06e64bc2-e23b-4b88-8e5e-87d979fd10f3","Type":"ContainerStarted","Data":"811c24d9bba927d2a0454129ff65371f03a9cf19f1eb64461ec0af208c92d618"} Mar 18 19:25:27 crc kubenswrapper[4830]: I0318 19:25:27.497075 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.497029685 podStartE2EDuration="7.497029685s" podCreationTimestamp="2026-03-18 19:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:27.49193253 +0000 UTC m=+4962.059562892" watchObservedRunningTime="2026-03-18 19:25:27.497029685 +0000 UTC m=+4962.064660047" Mar 18 19:25:27 crc kubenswrapper[4830]: E0318 19:25:27.808964 4830 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.39:57532->38.102.83.39:36855: write tcp 192.168.126.11:10250->192.168.126.11:47720: write: broken pipe Mar 18 19:25:27 crc kubenswrapper[4830]: I0318 19:25:27.889032 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.242870 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.294365 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.470616 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="dnsmasq-dns" containerID="cri-o://e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d" gracePeriod=10 Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.927542 4830 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.999839 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lntbt\" (UniqueName: \"kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt\") pod \"14df27a3-8516-4fc0-aef9-6f0c852697bf\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " Mar 18 19:25:28 crc kubenswrapper[4830]: I0318 19:25:28.999876 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config\") pod \"14df27a3-8516-4fc0-aef9-6f0c852697bf\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:28.999912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc\") pod \"14df27a3-8516-4fc0-aef9-6f0c852697bf\" (UID: \"14df27a3-8516-4fc0-aef9-6f0c852697bf\") " Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.014954 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt" (OuterVolumeSpecName: "kube-api-access-lntbt") pod "14df27a3-8516-4fc0-aef9-6f0c852697bf" (UID: "14df27a3-8516-4fc0-aef9-6f0c852697bf"). InnerVolumeSpecName "kube-api-access-lntbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.038149 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14df27a3-8516-4fc0-aef9-6f0c852697bf" (UID: "14df27a3-8516-4fc0-aef9-6f0c852697bf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.056274 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config" (OuterVolumeSpecName: "config") pod "14df27a3-8516-4fc0-aef9-6f0c852697bf" (UID: "14df27a3-8516-4fc0-aef9-6f0c852697bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.102050 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lntbt\" (UniqueName: \"kubernetes.io/projected/14df27a3-8516-4fc0-aef9-6f0c852697bf-kube-api-access-lntbt\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.102091 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.102102 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14df27a3-8516-4fc0-aef9-6f0c852697bf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.483261 4830 generic.go:334] "Generic (PLEG): container finished" podID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerID="e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d" exitCode=0 Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.483315 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" event={"ID":"14df27a3-8516-4fc0-aef9-6f0c852697bf","Type":"ContainerDied","Data":"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d"} Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.483348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" 
event={"ID":"14df27a3-8516-4fc0-aef9-6f0c852697bf","Type":"ContainerDied","Data":"e6df852424e968436d6e6a47395e417156ce731424a150e6d48f746282f71d2c"} Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.483371 4830 scope.go:117] "RemoveContainer" containerID="e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.483371 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-5h5mm" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.505300 4830 scope.go:117] "RemoveContainer" containerID="c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.527184 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.537445 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-5h5mm"] Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.540533 4830 scope.go:117] "RemoveContainer" containerID="e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d" Mar 18 19:25:29 crc kubenswrapper[4830]: E0318 19:25:29.541138 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d\": container with ID starting with e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d not found: ID does not exist" containerID="e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.541197 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d"} err="failed to get container status 
\"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d\": rpc error: code = NotFound desc = could not find container \"e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d\": container with ID starting with e57920adab1c76070376709c0d7fc36dee3204357ea6469a09c29def63cb8c6d not found: ID does not exist" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.541226 4830 scope.go:117] "RemoveContainer" containerID="c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314" Mar 18 19:25:29 crc kubenswrapper[4830]: E0318 19:25:29.541596 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314\": container with ID starting with c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314 not found: ID does not exist" containerID="c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314" Mar 18 19:25:29 crc kubenswrapper[4830]: I0318 19:25:29.541620 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314"} err="failed to get container status \"c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314\": rpc error: code = NotFound desc = could not find container \"c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314\": container with ID starting with c055a7a13638bbfdd7aca6b2860945547e7be838c083d49dbdfe09f3fef90314 not found: ID does not exist" Mar 18 19:25:30 crc kubenswrapper[4830]: I0318 19:25:30.015122 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 19:25:30 crc kubenswrapper[4830]: I0318 19:25:30.015249 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 19:25:30 crc kubenswrapper[4830]: I0318 19:25:30.245997 4830 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" path="/var/lib/kubelet/pods/14df27a3-8516-4fc0-aef9-6f0c852697bf/volumes" Mar 18 19:25:31 crc kubenswrapper[4830]: I0318 19:25:31.658943 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:31 crc kubenswrapper[4830]: I0318 19:25:31.659261 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:31 crc kubenswrapper[4830]: I0318 19:25:31.772536 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:31 crc kubenswrapper[4830]: I0318 19:25:31.953759 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 19:25:32 crc kubenswrapper[4830]: I0318 19:25:32.344127 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 19:25:32 crc kubenswrapper[4830]: I0318 19:25:32.448635 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 19:25:32 crc kubenswrapper[4830]: I0318 19:25:32.598909 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.639681 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b4p68"] Mar 18 19:25:38 crc kubenswrapper[4830]: E0318 19:25:38.640513 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="dnsmasq-dns" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.640527 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="dnsmasq-dns" Mar 18 19:25:38 crc kubenswrapper[4830]: E0318 19:25:38.640560 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="init" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.640568 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="init" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.640731 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="14df27a3-8516-4fc0-aef9-6f0c852697bf" containerName="dnsmasq-dns" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.641299 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.645873 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.655893 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b4p68"] Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.769531 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.769720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55s5b\" (UniqueName: \"kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.871937 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.872052 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55s5b\" (UniqueName: \"kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.873734 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.908583 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55s5b\" (UniqueName: \"kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b\") pod \"root-account-create-update-b4p68\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:38 crc kubenswrapper[4830]: I0318 19:25:38.977535 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:39 crc kubenswrapper[4830]: I0318 19:25:39.518349 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b4p68"] Mar 18 19:25:39 crc kubenswrapper[4830]: I0318 19:25:39.589976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b4p68" event={"ID":"f64d7355-21ea-4ea5-911f-d066f9cb5cda","Type":"ContainerStarted","Data":"efeda4d22e13fed4d71930234ac4f087a214d6ad54bf2a6118c9d6e1b9b21498"} Mar 18 19:25:40 crc kubenswrapper[4830]: I0318 19:25:40.602386 4830 generic.go:334] "Generic (PLEG): container finished" podID="f64d7355-21ea-4ea5-911f-d066f9cb5cda" containerID="f7f83d76884de6e54ab330c49fc08f69f2b52725fcd7cac00f09aa160afd4f80" exitCode=0 Mar 18 19:25:40 crc kubenswrapper[4830]: I0318 19:25:40.602509 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b4p68" event={"ID":"f64d7355-21ea-4ea5-911f-d066f9cb5cda","Type":"ContainerDied","Data":"f7f83d76884de6e54ab330c49fc08f69f2b52725fcd7cac00f09aa160afd4f80"} Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.021132 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.126631 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts\") pod \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.126733 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55s5b\" (UniqueName: \"kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b\") pod \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\" (UID: \"f64d7355-21ea-4ea5-911f-d066f9cb5cda\") " Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.128795 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f64d7355-21ea-4ea5-911f-d066f9cb5cda" (UID: "f64d7355-21ea-4ea5-911f-d066f9cb5cda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.139117 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b" (OuterVolumeSpecName: "kube-api-access-55s5b") pod "f64d7355-21ea-4ea5-911f-d066f9cb5cda" (UID: "f64d7355-21ea-4ea5-911f-d066f9cb5cda"). InnerVolumeSpecName "kube-api-access-55s5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.228037 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64d7355-21ea-4ea5-911f-d066f9cb5cda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.228085 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55s5b\" (UniqueName: \"kubernetes.io/projected/f64d7355-21ea-4ea5-911f-d066f9cb5cda-kube-api-access-55s5b\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.622077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b4p68" event={"ID":"f64d7355-21ea-4ea5-911f-d066f9cb5cda","Type":"ContainerDied","Data":"efeda4d22e13fed4d71930234ac4f087a214d6ad54bf2a6118c9d6e1b9b21498"} Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.622118 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efeda4d22e13fed4d71930234ac4f087a214d6ad54bf2a6118c9d6e1b9b21498" Mar 18 19:25:42 crc kubenswrapper[4830]: I0318 19:25:42.622195 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b4p68" Mar 18 19:25:45 crc kubenswrapper[4830]: I0318 19:25:45.290424 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b4p68"] Mar 18 19:25:45 crc kubenswrapper[4830]: I0318 19:25:45.300036 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b4p68"] Mar 18 19:25:46 crc kubenswrapper[4830]: I0318 19:25:46.256364 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64d7355-21ea-4ea5-911f-d066f9cb5cda" path="/var/lib/kubelet/pods/f64d7355-21ea-4ea5-911f-d066f9cb5cda/volumes" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.303540 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xf6rc"] Mar 18 19:25:50 crc kubenswrapper[4830]: E0318 19:25:50.304133 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d7355-21ea-4ea5-911f-d066f9cb5cda" containerName="mariadb-account-create-update" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.304145 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d7355-21ea-4ea5-911f-d066f9cb5cda" containerName="mariadb-account-create-update" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.304274 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64d7355-21ea-4ea5-911f-d066f9cb5cda" containerName="mariadb-account-create-update" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.304802 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.307700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.313606 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xf6rc"] Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.364139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vhs\" (UniqueName: \"kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs\") pod \"root-account-create-update-xf6rc\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.364284 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts\") pod \"root-account-create-update-xf6rc\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.466592 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vhs\" (UniqueName: \"kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs\") pod \"root-account-create-update-xf6rc\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.466708 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts\") pod \"root-account-create-update-xf6rc\" (UID: 
\"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.468220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts\") pod \"root-account-create-update-xf6rc\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.488201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vhs\" (UniqueName: \"kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs\") pod \"root-account-create-update-xf6rc\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:50 crc kubenswrapper[4830]: I0318 19:25:50.630152 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:51 crc kubenswrapper[4830]: I0318 19:25:51.108920 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xf6rc"] Mar 18 19:25:51 crc kubenswrapper[4830]: I0318 19:25:51.711188 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6cac547-ed4f-439e-80d4-2deb7c49dec7" containerID="745b75b99bb44c6f5cc8cbffd58e409153b4b2829442ea7e5bbbd3bcfddb1377" exitCode=0 Mar 18 19:25:51 crc kubenswrapper[4830]: I0318 19:25:51.711239 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xf6rc" event={"ID":"f6cac547-ed4f-439e-80d4-2deb7c49dec7","Type":"ContainerDied","Data":"745b75b99bb44c6f5cc8cbffd58e409153b4b2829442ea7e5bbbd3bcfddb1377"} Mar 18 19:25:51 crc kubenswrapper[4830]: I0318 19:25:51.711287 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xf6rc" event={"ID":"f6cac547-ed4f-439e-80d4-2deb7c49dec7","Type":"ContainerStarted","Data":"623446cf1d3eadad132f9aaf3a185bd2388b31f2c5edd04174e2af906e148402"} Mar 18 19:25:52 crc kubenswrapper[4830]: I0318 19:25:52.725068 4830 generic.go:334] "Generic (PLEG): container finished" podID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerID="a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f" exitCode=0 Mar 18 19:25:52 crc kubenswrapper[4830]: I0318 19:25:52.725211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerDied","Data":"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"} Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.064272 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.110204 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts\") pod \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.110384 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2vhs\" (UniqueName: \"kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs\") pod \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\" (UID: \"f6cac547-ed4f-439e-80d4-2deb7c49dec7\") " Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.110822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6cac547-ed4f-439e-80d4-2deb7c49dec7" (UID: "f6cac547-ed4f-439e-80d4-2deb7c49dec7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.111104 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cac547-ed4f-439e-80d4-2deb7c49dec7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.115715 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs" (OuterVolumeSpecName: "kube-api-access-n2vhs") pod "f6cac547-ed4f-439e-80d4-2deb7c49dec7" (UID: "f6cac547-ed4f-439e-80d4-2deb7c49dec7"). InnerVolumeSpecName "kube-api-access-n2vhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.213144 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2vhs\" (UniqueName: \"kubernetes.io/projected/f6cac547-ed4f-439e-80d4-2deb7c49dec7-kube-api-access-n2vhs\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.738754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerStarted","Data":"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"} Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.739575 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.742477 4830 generic.go:334] "Generic (PLEG): container finished" podID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerID="5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6" exitCode=0 Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.742557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerDied","Data":"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"} Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.746997 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xf6rc" event={"ID":"f6cac547-ed4f-439e-80d4-2deb7c49dec7","Type":"ContainerDied","Data":"623446cf1d3eadad132f9aaf3a185bd2388b31f2c5edd04174e2af906e148402"} Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.747047 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623446cf1d3eadad132f9aaf3a185bd2388b31f2c5edd04174e2af906e148402" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.747119 4830 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xf6rc" Mar 18 19:25:53 crc kubenswrapper[4830]: I0318 19:25:53.777633 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.777605723 podStartE2EDuration="36.777605723s" podCreationTimestamp="2026-03-18 19:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:53.775266047 +0000 UTC m=+4988.342896409" watchObservedRunningTime="2026-03-18 19:25:53.777605723 +0000 UTC m=+4988.345236055" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.369226 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:25:54 crc kubenswrapper[4830]: E0318 19:25:54.369609 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cac547-ed4f-439e-80d4-2deb7c49dec7" containerName="mariadb-account-create-update" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.369624 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cac547-ed4f-439e-80d4-2deb7c49dec7" containerName="mariadb-account-create-update" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.369813 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cac547-ed4f-439e-80d4-2deb7c49dec7" containerName="mariadb-account-create-update" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.371206 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.399694 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.435357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.435441 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zpr\" (UniqueName: \"kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.435477 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.537107 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zpr\" (UniqueName: \"kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.537471 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.537602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.538175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.538230 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.563603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zpr\" (UniqueName: \"kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr\") pod \"redhat-operators-9jwl7\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.692061 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.757623 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerStarted","Data":"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"} Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.758551 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.784916 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.784902206 podStartE2EDuration="37.784902206s" podCreationTimestamp="2026-03-18 19:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:25:54.783514717 +0000 UTC m=+4989.351145059" watchObservedRunningTime="2026-03-18 19:25:54.784902206 +0000 UTC m=+4989.352532548" Mar 18 19:25:54 crc kubenswrapper[4830]: I0318 19:25:54.975925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:25:55 crc kubenswrapper[4830]: I0318 19:25:55.786209 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb629544-20f1-419d-a3fc-c99f0018f801" containerID="e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a" exitCode=0 Mar 18 19:25:55 crc kubenswrapper[4830]: I0318 19:25:55.786330 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerDied","Data":"e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a"} Mar 18 19:25:55 crc kubenswrapper[4830]: I0318 19:25:55.786843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerStarted","Data":"eabcd7a8349b3f4a284f9b47e2611dce2f676397cd65172bb01f577cae421133"} Mar 18 19:25:55 crc kubenswrapper[4830]: I0318 19:25:55.789319 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:25:56 crc kubenswrapper[4830]: I0318 19:25:56.796486 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerStarted","Data":"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1"} Mar 18 19:25:57 crc kubenswrapper[4830]: I0318 19:25:57.807091 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb629544-20f1-419d-a3fc-c99f0018f801" containerID="d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1" exitCode=0 Mar 18 19:25:57 crc kubenswrapper[4830]: I0318 19:25:57.807163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerDied","Data":"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1"} Mar 18 19:25:58 crc kubenswrapper[4830]: I0318 19:25:58.818082 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerStarted","Data":"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4"} Mar 18 19:25:58 crc kubenswrapper[4830]: I0318 19:25:58.838034 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jwl7" podStartSLOduration=2.365979042 podStartE2EDuration="4.838015001s" podCreationTimestamp="2026-03-18 19:25:54 +0000 UTC" firstStartedPulling="2026-03-18 19:25:55.788901626 +0000 UTC m=+4990.356531958" 
lastFinishedPulling="2026-03-18 19:25:58.260937575 +0000 UTC m=+4992.828567917" observedRunningTime="2026-03-18 19:25:58.835040657 +0000 UTC m=+4993.402671029" watchObservedRunningTime="2026-03-18 19:25:58.838015001 +0000 UTC m=+4993.405645343" Mar 18 19:25:59 crc kubenswrapper[4830]: I0318 19:25:59.509242 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:25:59 crc kubenswrapper[4830]: I0318 19:25:59.509291 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.141078 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564366-dwmzn"] Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.142205 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.145224 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.145376 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.145562 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.149576 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-dwmzn"] Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.318863 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwfrb\" (UniqueName: \"kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb\") pod \"auto-csr-approver-29564366-dwmzn\" (UID: \"84e011eb-ce09-43f1-82ec-f3d3c1b025b4\") " pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.420756 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwfrb\" (UniqueName: \"kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb\") pod \"auto-csr-approver-29564366-dwmzn\" (UID: \"84e011eb-ce09-43f1-82ec-f3d3c1b025b4\") " pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.457221 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwfrb\" (UniqueName: \"kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb\") pod \"auto-csr-approver-29564366-dwmzn\" (UID: \"84e011eb-ce09-43f1-82ec-f3d3c1b025b4\") " 
pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.462481 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:00 crc kubenswrapper[4830]: W0318 19:26:00.932592 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e011eb_ce09_43f1_82ec_f3d3c1b025b4.slice/crio-2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67 WatchSource:0}: Error finding container 2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67: Status 404 returned error can't find the container with id 2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67 Mar 18 19:26:00 crc kubenswrapper[4830]: I0318 19:26:00.934442 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-dwmzn"] Mar 18 19:26:01 crc kubenswrapper[4830]: I0318 19:26:01.844543 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" event={"ID":"84e011eb-ce09-43f1-82ec-f3d3c1b025b4","Type":"ContainerStarted","Data":"2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67"} Mar 18 19:26:02 crc kubenswrapper[4830]: I0318 19:26:02.856735 4830 generic.go:334] "Generic (PLEG): container finished" podID="84e011eb-ce09-43f1-82ec-f3d3c1b025b4" containerID="de68858792bec333aecb9155a45dab5286433cd2198d3c7e9d45ccb6a050742a" exitCode=0 Mar 18 19:26:02 crc kubenswrapper[4830]: I0318 19:26:02.857202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" event={"ID":"84e011eb-ce09-43f1-82ec-f3d3c1b025b4","Type":"ContainerDied","Data":"de68858792bec333aecb9155a45dab5286433cd2198d3c7e9d45ccb6a050742a"} Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.195128 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.290287 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwfrb\" (UniqueName: \"kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb\") pod \"84e011eb-ce09-43f1-82ec-f3d3c1b025b4\" (UID: \"84e011eb-ce09-43f1-82ec-f3d3c1b025b4\") " Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.295465 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb" (OuterVolumeSpecName: "kube-api-access-zwfrb") pod "84e011eb-ce09-43f1-82ec-f3d3c1b025b4" (UID: "84e011eb-ce09-43f1-82ec-f3d3c1b025b4"). InnerVolumeSpecName "kube-api-access-zwfrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.391930 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwfrb\" (UniqueName: \"kubernetes.io/projected/84e011eb-ce09-43f1-82ec-f3d3c1b025b4-kube-api-access-zwfrb\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.692650 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.692714 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.875291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" event={"ID":"84e011eb-ce09-43f1-82ec-f3d3c1b025b4","Type":"ContainerDied","Data":"2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67"} Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.875337 4830 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="2561b8161258a97a878e1e931d18736ac2fb8f8f87387a33d5a3a78a57e59c67" Mar 18 19:26:04 crc kubenswrapper[4830]: I0318 19:26:04.875374 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-dwmzn" Mar 18 19:26:05 crc kubenswrapper[4830]: I0318 19:26:05.293576 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-kxczb"] Mar 18 19:26:05 crc kubenswrapper[4830]: I0318 19:26:05.326818 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-kxczb"] Mar 18 19:26:05 crc kubenswrapper[4830]: I0318 19:26:05.757487 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9jwl7" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="registry-server" probeResult="failure" output=< Mar 18 19:26:05 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 19:26:05 crc kubenswrapper[4830]: > Mar 18 19:26:06 crc kubenswrapper[4830]: I0318 19:26:06.250147 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d2b36a-4163-4218-915b-92c6ec36414d" path="/var/lib/kubelet/pods/b3d2b36a-4163-4218-915b-92c6ec36414d/volumes" Mar 18 19:26:09 crc kubenswrapper[4830]: I0318 19:26:09.021145 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:26:09 crc kubenswrapper[4830]: I0318 19:26:09.316985 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 19:26:14 crc kubenswrapper[4830]: I0318 19:26:14.768029 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:14 crc kubenswrapper[4830]: I0318 19:26:14.844988 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:15 crc kubenswrapper[4830]: I0318 19:26:15.025025 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:26:15 crc kubenswrapper[4830]: I0318 19:26:15.979526 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jwl7" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="registry-server" containerID="cri-o://29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4" gracePeriod=2 Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.461245 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.604841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities\") pod \"bb629544-20f1-419d-a3fc-c99f0018f801\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.604917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content\") pod \"bb629544-20f1-419d-a3fc-c99f0018f801\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.605019 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4zpr\" (UniqueName: \"kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr\") pod \"bb629544-20f1-419d-a3fc-c99f0018f801\" (UID: \"bb629544-20f1-419d-a3fc-c99f0018f801\") " Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.606387 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities" (OuterVolumeSpecName: "utilities") pod "bb629544-20f1-419d-a3fc-c99f0018f801" (UID: "bb629544-20f1-419d-a3fc-c99f0018f801"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.613844 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr" (OuterVolumeSpecName: "kube-api-access-r4zpr") pod "bb629544-20f1-419d-a3fc-c99f0018f801" (UID: "bb629544-20f1-419d-a3fc-c99f0018f801"). InnerVolumeSpecName "kube-api-access-r4zpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.707202 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4zpr\" (UniqueName: \"kubernetes.io/projected/bb629544-20f1-419d-a3fc-c99f0018f801-kube-api-access-r4zpr\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.707263 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.822626 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb629544-20f1-419d-a3fc-c99f0018f801" (UID: "bb629544-20f1-419d-a3fc-c99f0018f801"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.911558 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb629544-20f1-419d-a3fc-c99f0018f801-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.996068 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb629544-20f1-419d-a3fc-c99f0018f801" containerID="29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4" exitCode=0 Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.996171 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jwl7" Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.996162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerDied","Data":"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4"} Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.996247 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jwl7" event={"ID":"bb629544-20f1-419d-a3fc-c99f0018f801","Type":"ContainerDied","Data":"eabcd7a8349b3f4a284f9b47e2611dce2f676397cd65172bb01f577cae421133"} Mar 18 19:26:16 crc kubenswrapper[4830]: I0318 19:26:16.996311 4830 scope.go:117] "RemoveContainer" containerID="29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4" Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.047399 4830 scope.go:117] "RemoveContainer" containerID="d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1" Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.064083 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 
19:26:17.076730 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jwl7"] Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.088368 4830 scope.go:117] "RemoveContainer" containerID="e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a" Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.125671 4830 scope.go:117] "RemoveContainer" containerID="29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4" Mar 18 19:26:17 crc kubenswrapper[4830]: E0318 19:26:17.126203 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4\": container with ID starting with 29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4 not found: ID does not exist" containerID="29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4" Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.126255 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4"} err="failed to get container status \"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4\": rpc error: code = NotFound desc = could not find container \"29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4\": container with ID starting with 29d071874aa53a75d3b9fb862d23a2e9293a3bc90b89be6630f311617c2002d4 not found: ID does not exist" Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.126286 4830 scope.go:117] "RemoveContainer" containerID="d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1" Mar 18 19:26:17 crc kubenswrapper[4830]: E0318 19:26:17.127275 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1\": container with ID 
starting with d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1 not found: ID does not exist" containerID="d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1"
Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.127351 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1"} err="failed to get container status \"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1\": rpc error: code = NotFound desc = could not find container \"d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1\": container with ID starting with d2b99c131b9c703f80d72a1d2fa14288dd8c4e31cfc321583812199631b66eb1 not found: ID does not exist"
Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.127397 4830 scope.go:117] "RemoveContainer" containerID="e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a"
Mar 18 19:26:17 crc kubenswrapper[4830]: E0318 19:26:17.127866 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a\": container with ID starting with e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a not found: ID does not exist" containerID="e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a"
Mar 18 19:26:17 crc kubenswrapper[4830]: I0318 19:26:17.127931 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a"} err="failed to get container status \"e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a\": rpc error: code = NotFound desc = could not find container \"e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a\": container with ID starting with e5a6fe944fbd5b97dbd4078f74f60d417cb88a0af5fac9ea444758dc77841e7a not found: ID does not exist"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.247644 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" path="/var/lib/kubelet/pods/bb629544-20f1-419d-a3fc-c99f0018f801/volumes"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.984904 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"]
Mar 18 19:26:18 crc kubenswrapper[4830]: E0318 19:26:18.985654 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e011eb-ce09-43f1-82ec-f3d3c1b025b4" containerName="oc"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.985685 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e011eb-ce09-43f1-82ec-f3d3c1b025b4" containerName="oc"
Mar 18 19:26:18 crc kubenswrapper[4830]: E0318 19:26:18.985705 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="extract-utilities"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.985716 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="extract-utilities"
Mar 18 19:26:18 crc kubenswrapper[4830]: E0318 19:26:18.985741 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="extract-content"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.985754 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="extract-content"
Mar 18 19:26:18 crc kubenswrapper[4830]: E0318 19:26:18.985810 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="registry-server"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.985823 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="registry-server"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.986068 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e011eb-ce09-43f1-82ec-f3d3c1b025b4" containerName="oc"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.986113 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb629544-20f1-419d-a3fc-c99f0018f801" containerName="registry-server"
Mar 18 19:26:18 crc kubenswrapper[4830]: I0318 19:26:18.987359 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.005490 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"]
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.149806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.149933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfcr\" (UniqueName: \"kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.149975 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.251599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.251705 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.251803 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmfcr\" (UniqueName: \"kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.253108 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.253241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.275451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmfcr\" (UniqueName: \"kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr\") pod \"dnsmasq-dns-684c864bc9-grnt7\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.316193 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.788902 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"]
Mar 18 19:26:19 crc kubenswrapper[4830]: W0318 19:26:19.794967 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05dc5c44_41f3_44d6_ab05_9054e98e2523.slice/crio-ac53c57fe0a41b48573a8d4513e73c2ccfa99456d441423e52199a33183e2893 WatchSource:0}: Error finding container ac53c57fe0a41b48573a8d4513e73c2ccfa99456d441423e52199a33183e2893: Status 404 returned error can't find the container with id ac53c57fe0a41b48573a8d4513e73c2ccfa99456d441423e52199a33183e2893
Mar 18 19:26:19 crc kubenswrapper[4830]: I0318 19:26:19.955256 4830 scope.go:117] "RemoveContainer" containerID="ffb9cea5eeb3bd91132199e378645b5e519aa3b9f41e7af03f50b6cc0f445973"
Mar 18 19:26:20 crc kubenswrapper[4830]: I0318 19:26:20.045045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerStarted","Data":"27abcad188fd5fd45b22ca44b46ac2c199a9392eae18ca84e606a09043a6dd09"}
Mar 18 19:26:20 crc kubenswrapper[4830]: I0318 19:26:20.045133 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerStarted","Data":"ac53c57fe0a41b48573a8d4513e73c2ccfa99456d441423e52199a33183e2893"}
Mar 18 19:26:20 crc kubenswrapper[4830]: I0318 19:26:20.258246 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 19:26:21 crc kubenswrapper[4830]: I0318 19:26:21.052154 4830 generic.go:334] "Generic (PLEG): container finished" podID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerID="27abcad188fd5fd45b22ca44b46ac2c199a9392eae18ca84e606a09043a6dd09" exitCode=0
Mar 18 19:26:21 crc kubenswrapper[4830]: I0318 19:26:21.052242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerDied","Data":"27abcad188fd5fd45b22ca44b46ac2c199a9392eae18ca84e606a09043a6dd09"}
Mar 18 19:26:21 crc kubenswrapper[4830]: I0318 19:26:21.252154 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:22 crc kubenswrapper[4830]: I0318 19:26:22.065418 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerStarted","Data":"d60876bc67bb8b3d2d1c869fd3b1cc8a48a268e96e515e37bd5f92f9279e68ed"}
Mar 18 19:26:22 crc kubenswrapper[4830]: I0318 19:26:22.065763 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:22 crc kubenswrapper[4830]: I0318 19:26:22.084386 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" podStartSLOduration=4.084363006 podStartE2EDuration="4.084363006s" podCreationTimestamp="2026-03-18 19:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:26:22.08310835 +0000 UTC m=+5016.650738672" watchObservedRunningTime="2026-03-18 19:26:22.084363006 +0000 UTC m=+5016.651993348"
Mar 18 19:26:24 crc kubenswrapper[4830]: I0318 19:26:24.459781 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="rabbitmq" containerID="cri-o://2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02" gracePeriod=604796
Mar 18 19:26:25 crc kubenswrapper[4830]: I0318 19:26:25.402897 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="rabbitmq" containerID="cri-o://5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b" gracePeriod=604796
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.016690 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.30:5671: connect: connection refused"
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.314485 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.31:5671: connect: connection refused"
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.317981 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-684c864bc9-grnt7"
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.387061 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"]
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.387579 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="dnsmasq-dns" containerID="cri-o://b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb" gracePeriod=10
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.519346 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.519991 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.831838 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7"
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.937613 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config\") pod \"1015e46e-5610-4002-b19f-52b3489eb469\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") "
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.937707 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrr2\" (UniqueName: \"kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2\") pod \"1015e46e-5610-4002-b19f-52b3489eb469\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") "
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.937738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc\") pod \"1015e46e-5610-4002-b19f-52b3489eb469\" (UID: \"1015e46e-5610-4002-b19f-52b3489eb469\") "
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.942944 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2" (OuterVolumeSpecName: "kube-api-access-qsrr2") pod "1015e46e-5610-4002-b19f-52b3489eb469" (UID: "1015e46e-5610-4002-b19f-52b3489eb469"). InnerVolumeSpecName "kube-api-access-qsrr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:26:29 crc kubenswrapper[4830]: I0318 19:26:29.986108 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1015e46e-5610-4002-b19f-52b3489eb469" (UID: "1015e46e-5610-4002-b19f-52b3489eb469"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.000840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config" (OuterVolumeSpecName: "config") pod "1015e46e-5610-4002-b19f-52b3489eb469" (UID: "1015e46e-5610-4002-b19f-52b3489eb469"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.039373 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-config\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.039405 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrr2\" (UniqueName: \"kubernetes.io/projected/1015e46e-5610-4002-b19f-52b3489eb469-kube-api-access-qsrr2\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.039420 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1015e46e-5610-4002-b19f-52b3489eb469-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.140628 4830 generic.go:334] "Generic (PLEG): container finished" podID="1015e46e-5610-4002-b19f-52b3489eb469" containerID="b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb" exitCode=0
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.140679 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" event={"ID":"1015e46e-5610-4002-b19f-52b3489eb469","Type":"ContainerDied","Data":"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"}
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.141013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7" event={"ID":"1015e46e-5610-4002-b19f-52b3489eb469","Type":"ContainerDied","Data":"63db4b24ff79c84497af31a0ac398c9e8d87e7f619ae7a301160a80a2b7eb37d"}
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.141033 4830 scope.go:117] "RemoveContainer" containerID="b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.140711 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-fqzh7"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.168275 4830 scope.go:117] "RemoveContainer" containerID="58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.173006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"]
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.187985 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-fqzh7"]
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.201339 4830 scope.go:117] "RemoveContainer" containerID="b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"
Mar 18 19:26:30 crc kubenswrapper[4830]: E0318 19:26:30.201848 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb\": container with ID starting with b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb not found: ID does not exist" containerID="b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.201900 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb"} err="failed to get container status \"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb\": rpc error: code = NotFound desc = could not find container \"b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb\": container with ID starting with b5723c0651cdf28aed2faa8dc7b263887ef80f916118d406723537f6e13ca8eb not found: ID does not exist"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.201936 4830 scope.go:117] "RemoveContainer" containerID="58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845"
Mar 18 19:26:30 crc kubenswrapper[4830]: E0318 19:26:30.202362 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845\": container with ID starting with 58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845 not found: ID does not exist" containerID="58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.202429 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845"} err="failed to get container status \"58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845\": rpc error: code = NotFound desc = could not find container \"58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845\": container with ID starting with 58ae65aafccde69fe53f9ca835398c60a0db038f150198d5360d75c2842bf845 not found: ID does not exist"
Mar 18 19:26:30 crc kubenswrapper[4830]: I0318 19:26:30.249501 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1015e46e-5610-4002-b19f-52b3489eb469" path="/var/lib/kubelet/pods/1015e46e-5610-4002-b19f-52b3489eb469/volumes"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.050654 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.152920 4830 generic.go:334] "Generic (PLEG): container finished" podID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerID="2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02" exitCode=0
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.153008 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.153022 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerDied","Data":"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"}
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.153417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f91c39a4-66e6-4401-950d-88b4d7d2a851","Type":"ContainerDied","Data":"c6b5943e57c0336bb4058e28c1bec41b744583dbc0a7c8c58eb5aa9888fe2b07"}
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.153440 4830 scope.go:117] "RemoveContainer" containerID="2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155563 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155610 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155926 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155958 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.155983 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.156003 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.156075 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7pnf\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.156129 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.156174 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls\") pod \"f91c39a4-66e6-4401-950d-88b4d7d2a851\" (UID: \"f91c39a4-66e6-4401-950d-88b4d7d2a851\") "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.161246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.161377 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.161609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.163962 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf" (OuterVolumeSpecName: "kube-api-access-f7pnf") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "kube-api-access-f7pnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.166977 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.168726 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.174826 4830 scope.go:117] "RemoveContainer" containerID="5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.175151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data" (OuterVolumeSpecName: "config-data") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.181175 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b" (OuterVolumeSpecName: "persistence") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.188443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info" (OuterVolumeSpecName: "pod-info") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.210257 4830 scope.go:117] "RemoveContainer" containerID="2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"
Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.210763 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02\": container with ID starting with 2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02 not found: ID does not exist" containerID="2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.210937 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02"} err="failed to get container status \"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02\": rpc error: code = NotFound desc = could not find container \"2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02\": container with ID starting with 2f791ea09c6c645df0c80f104b52f59cc0c63ea0d9ecfd4d38d52de6416e6d02 not found: ID does not exist"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.211055 4830 scope.go:117] "RemoveContainer" containerID="5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"
Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.212354 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6\": container with ID starting with 5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6 not found: ID does not exist" containerID="5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.212429 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6"} err="failed to get container status \"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6\": rpc error: code = NotFound desc = could not find container \"5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6\": container with ID starting with 5a9e9e51604631c5ca3ba0b641689b74975dcffd4b2d35f095b8a98ebf3fb3f6 not found: ID does not exist"
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.223851 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf" (OuterVolumeSpecName: "server-conf") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.230741 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f91c39a4-66e6-4401-950d-88b4d7d2a851" (UID: "f91c39a4-66e6-4401-950d-88b4d7d2a851"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258031 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7pnf\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-kube-api-access-f7pnf\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258066 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258079 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258092 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258103 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f91c39a4-66e6-4401-950d-88b4d7d2a851-pod-info\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258115 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258152 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") on node \"crc\" "
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258168 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258178 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f91c39a4-66e6-4401-950d-88b4d7d2a851-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258186 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f91c39a4-66e6-4401-950d-88b4d7d2a851-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.258195 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f91c39a4-66e6-4401-950d-88b4d7d2a851-server-conf\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.278209 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.278372 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b") on node "crc" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.359979 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.511272 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.521485 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.555838 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.556229 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="setup-container" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556253 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="setup-container" Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.556270 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="dnsmasq-dns" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556279 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="dnsmasq-dns" Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.556292 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="init" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556300 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="init" Mar 18 19:26:31 crc kubenswrapper[4830]: E0318 19:26:31.556322 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="rabbitmq" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556330 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="rabbitmq" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556618 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1015e46e-5610-4002-b19f-52b3489eb469" containerName="dnsmasq-dns" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.556642 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" containerName="rabbitmq" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.558177 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.560711 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.565815 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j7w4c" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.565889 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.566256 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.566509 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.566293 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.566981 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.593108 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662707 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662759 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662804 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662845 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvrfx\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-kube-api-access-bvrfx\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662884 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662905 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662931 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd410e3-2ad4-4616-86ca-a5423a651ab7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd410e3-2ad4-4616-86ca-a5423a651ab7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.662986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.663012 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.763929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvrfx\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-kube-api-access-bvrfx\") 
pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.763991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764065 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764089 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd410e3-2ad4-4616-86ca-a5423a651ab7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd410e3-2ad4-4616-86ca-a5423a651ab7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: 
I0318 19:26:31.764137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764176 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764242 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764289 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764529 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.764644 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.765789 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.766377 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.766765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd410e3-2ad4-4616-86ca-a5423a651ab7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.768814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc 
kubenswrapper[4830]: I0318 19:26:31.769241 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.769252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.769285 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/295cf756ac76f5e0c3f6c5c915148602110b4493b539d2e8b27ffd4382606983/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.771333 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd410e3-2ad4-4616-86ca-a5423a651ab7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.773319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd410e3-2ad4-4616-86ca-a5423a651ab7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.781035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvrfx\" 
(UniqueName: \"kubernetes.io/projected/4cd410e3-2ad4-4616-86ca-a5423a651ab7-kube-api-access-bvrfx\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.814206 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-81b3621f-8ddd-4d7d-9865-b1bdebd60e8b\") pod \"rabbitmq-server-0\" (UID: \"4cd410e3-2ad4-4616-86ca-a5423a651ab7\") " pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.874709 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:26:31 crc kubenswrapper[4830]: I0318 19:26:31.924387 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.068608 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.068995 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbpgh\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: 
\"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069103 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069181 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069248 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069268 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.069351 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.070157 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data\") pod \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\" (UID: \"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416\") " Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.070831 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.071906 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.073410 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.074801 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info" (OuterVolumeSpecName: "pod-info") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.075919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh" (OuterVolumeSpecName: "kube-api-access-fbpgh") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "kube-api-access-fbpgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.077698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.077818 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.084277 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608" (OuterVolumeSpecName: "persistence") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "pvc-f6068b25-a136-4337-851f-286cb129b608". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.107844 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data" (OuterVolumeSpecName: "config-data") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.138092 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf" (OuterVolumeSpecName: "server-conf") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.159589 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" (UID: "5ec6dc5f-d924-482c-b2a3-b5dd8ce95416"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.162353 4830 generic.go:334] "Generic (PLEG): container finished" podID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerID="5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b" exitCode=0
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.162425 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerDied","Data":"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"}
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.162457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ec6dc5f-d924-482c-b2a3-b5dd8ce95416","Type":"ContainerDied","Data":"26ca6fb9338a0f6112c7be169117c520ec4e21372e4729e7741ffade01bff404"}
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.162478 4830 scope.go:117] "RemoveContainer" containerID="5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.162585 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175030 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175228 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-pod-info\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175299 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175357 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbpgh\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-kube-api-access-fbpgh\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175443 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175508 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175563 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-server-conf\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175616 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175677 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.175779 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") on node \"crc\" "
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.185043 4830 scope.go:117] "RemoveContainer" containerID="a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.204599 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.204829 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.205010 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f6068b25-a136-4337-851f-286cb129b608" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608") on node "crc"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.223075 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.223192 4830 scope.go:117] "RemoveContainer" containerID="5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"
Mar 18 19:26:32 crc kubenswrapper[4830]: E0318 19:26:32.223650 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b\": container with ID starting with 5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b not found: ID does not exist" containerID="5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.223684 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b"} err="failed to get container status \"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b\": rpc error: code = NotFound desc = could not find container \"5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b\": container with ID starting with 5b8e70422c4b67b50e12aa55b5d8fd5e4a5d1467f41f73922b90bb0ef567415b not found: ID does not exist"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.223710 4830 scope.go:117] "RemoveContainer" containerID="a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"
Mar 18 19:26:32 crc kubenswrapper[4830]: E0318 19:26:32.224282 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f\": container with ID starting with a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f not found: ID does not exist" containerID="a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.224418 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f"} err="failed to get container status \"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f\": rpc error: code = NotFound desc = could not find container \"a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f\": container with ID starting with a75d8aa328fd1d05da436c8fa09e658c91c183a3900c586a777ce66d7fb0875f not found: ID does not exist"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.231902 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:32 crc kubenswrapper[4830]: E0318 19:26:32.232508 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="setup-container"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.232604 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="setup-container"
Mar 18 19:26:32 crc kubenswrapper[4830]: E0318 19:26:32.232712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="rabbitmq"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.232804 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="rabbitmq"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.233097 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" containerName="rabbitmq"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.234350 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.236014 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.236402 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.236630 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.237433 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.237474 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.237494 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.238336 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nffzv"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.255542 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec6dc5f-d924-482c-b2a3-b5dd8ce95416" path="/var/lib/kubelet/pods/5ec6dc5f-d924-482c-b2a3-b5dd8ce95416/volumes"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.256287 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91c39a4-66e6-4401-950d-88b4d7d2a851" path="/var/lib/kubelet/pods/f91c39a4-66e6-4401-950d-88b4d7d2a851/volumes"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.256746 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.276883 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") on node \"crc\" DevicePath \"\""
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.357895 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 19:26:32 crc kubenswrapper[4830]: W0318 19:26:32.367168 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd410e3_2ad4_4616_86ca_a5423a651ab7.slice/crio-409155c510e0d9550dcac70e4184f5043b4cbf1bd1f22e2c572e55ef4c54eef1 WatchSource:0}: Error finding container 409155c510e0d9550dcac70e4184f5043b4cbf1bd1f22e2c572e55ef4c54eef1: Status 404 returned error can't find the container with id 409155c510e0d9550dcac70e4184f5043b4cbf1bd1f22e2c572e55ef4c54eef1
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377603 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxf5l\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-kube-api-access-wxf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377660 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377707 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377726 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377753 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.377885 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.378006 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.378055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26291628-d7b3-47e1-a7a4-81c569506ff8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.378180 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26291628-d7b3-47e1-a7a4-81c569506ff8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479297 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxf5l\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-kube-api-access-wxf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479811 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479844 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479912 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.479987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.480011 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26291628-d7b3-47e1-a7a4-81c569506ff8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.480059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26291628-d7b3-47e1-a7a4-81c569506ff8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.481485 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.482400 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.482404 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.483373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.483493 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26291628-d7b3-47e1-a7a4-81c569506ff8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.484589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.484847 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26291628-d7b3-47e1-a7a4-81c569506ff8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.485971 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26291628-d7b3-47e1-a7a4-81c569506ff8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.486543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.487666 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.487692 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fce16860e1b71769ef6ee1bb5fe8ee71a0a195bdb1f0e797608caa82b04c1475/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.503201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxf5l\" (UniqueName: \"kubernetes.io/projected/26291628-d7b3-47e1-a7a4-81c569506ff8-kube-api-access-wxf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.531673 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f6068b25-a136-4337-851f-286cb129b608\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6068b25-a136-4337-851f-286cb129b608\") pod \"rabbitmq-cell1-server-0\" (UID: \"26291628-d7b3-47e1-a7a4-81c569506ff8\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.553306 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:26:32 crc kubenswrapper[4830]: I0318 19:26:32.806143 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 19:26:33 crc kubenswrapper[4830]: I0318 19:26:33.174739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"26291628-d7b3-47e1-a7a4-81c569506ff8","Type":"ContainerStarted","Data":"ab25ac3b81cffc2cae5ddc8bcf078072b3543f2c12f3f6aba070496afbdc69b3"}
Mar 18 19:26:33 crc kubenswrapper[4830]: I0318 19:26:33.177632 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cd410e3-2ad4-4616-86ca-a5423a651ab7","Type":"ContainerStarted","Data":"409155c510e0d9550dcac70e4184f5043b4cbf1bd1f22e2c572e55ef4c54eef1"}
Mar 18 19:26:34 crc kubenswrapper[4830]: I0318 19:26:34.191535 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cd410e3-2ad4-4616-86ca-a5423a651ab7","Type":"ContainerStarted","Data":"5a847081164db9b081c72cb8e6834ae1010a8fcb5133827771ce86267548dbd2"}
Mar 18 19:26:35 crc kubenswrapper[4830]: I0318 19:26:35.203691 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"26291628-d7b3-47e1-a7a4-81c569506ff8","Type":"ContainerStarted","Data":"7e4af35e12cf2dda9f5af0fcafbd7612e3635f6206149aec2964f4a832ed31d8"}
Mar 18 19:26:59 crc kubenswrapper[4830]: I0318 19:26:59.509524 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:26:59 crc kubenswrapper[4830]: I0318 19:26:59.510168 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:26:59 crc kubenswrapper[4830]: I0318 19:26:59.510223 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb"
Mar 18 19:26:59 crc kubenswrapper[4830]: I0318 19:26:59.510929 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 19:26:59 crc kubenswrapper[4830]: I0318 19:26:59.510982 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" gracePeriod=600
Mar 18 19:26:59 crc kubenswrapper[4830]: E0318 19:26:59.654363 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:27:00 crc kubenswrapper[4830]: I0318 19:27:00.462674 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" exitCode=0
Mar 18 19:27:00 crc kubenswrapper[4830]: I0318 19:27:00.462739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"}
Mar 18 19:27:00 crc kubenswrapper[4830]: I0318 19:27:00.462825 4830 scope.go:117] "RemoveContainer" containerID="8ab0e5f91a7dbe7856c1160fe1f3fd41d957d07b4601ca351dadf0e31da9f972"
Mar 18 19:27:00 crc kubenswrapper[4830]: I0318 19:27:00.463611 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:27:00 crc kubenswrapper[4830]: E0318 19:27:00.464168 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:27:08 crc kubenswrapper[4830]: I0318 19:27:08.545119 4830 generic.go:334] "Generic (PLEG): container finished" podID="26291628-d7b3-47e1-a7a4-81c569506ff8" containerID="7e4af35e12cf2dda9f5af0fcafbd7612e3635f6206149aec2964f4a832ed31d8" exitCode=0
Mar 18 19:27:08 crc kubenswrapper[4830]: I0318 19:27:08.545249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"26291628-d7b3-47e1-a7a4-81c569506ff8","Type":"ContainerDied","Data":"7e4af35e12cf2dda9f5af0fcafbd7612e3635f6206149aec2964f4a832ed31d8"}
Mar 18 19:27:08 crc kubenswrapper[4830]: I0318 19:27:08.550221 4830 generic.go:334] "Generic (PLEG): container finished" podID="4cd410e3-2ad4-4616-86ca-a5423a651ab7" containerID="5a847081164db9b081c72cb8e6834ae1010a8fcb5133827771ce86267548dbd2" exitCode=0
Mar 18 19:27:08 crc kubenswrapper[4830]: I0318 19:27:08.550281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cd410e3-2ad4-4616-86ca-a5423a651ab7","Type":"ContainerDied","Data":"5a847081164db9b081c72cb8e6834ae1010a8fcb5133827771ce86267548dbd2"}
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.563254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"26291628-d7b3-47e1-a7a4-81c569506ff8","Type":"ContainerStarted","Data":"173bce179908cad9f228eb6fe104ffc9e8f3af5f3648461480de0c75be94264d"}
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.563869 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.567025 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cd410e3-2ad4-4616-86ca-a5423a651ab7","Type":"ContainerStarted","Data":"3ec0cbc15c48d3d232d5645ba8475db34bd66d3c1b88a54aec74c9585b462853"}
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.567369 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.594446 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.594427803 podStartE2EDuration="37.594427803s" podCreationTimestamp="2026-03-18 19:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:27:09.593142946 +0000 UTC m=+5064.160773318" watchObservedRunningTime="2026-03-18 19:27:09.594427803 +0000 UTC m=+5064.162058135"
Mar 18 19:27:09 crc kubenswrapper[4830]: I0318 19:27:09.623878 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.623848862 podStartE2EDuration="38.623848862s" podCreationTimestamp="2026-03-18 19:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:27:09.620517288 +0000 UTC m=+5064.188147660" watchObservedRunningTime="2026-03-18 19:27:09.623848862 +0000 UTC m=+5064.191479244"
Mar 18 19:27:14 crc kubenswrapper[4830]: I0318 19:27:14.235398 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:27:14 crc kubenswrapper[4830]: E0318 19:27:14.236189 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:27:21 crc kubenswrapper[4830]: I0318 19:27:21.878413 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 18 19:27:22 crc kubenswrapper[4830]: I0318 19:27:22.558872 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 19:27:26 crc kubenswrapper[4830]: I0318 19:27:26.242930 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:27:26 crc kubenswrapper[4830]: E0318 19:27:26.244098 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:27:26 crc kubenswrapper[4830]: I0318 19:27:26.986071 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:27:26 crc kubenswrapper[4830]: I0318 19:27:26.987620 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:27:26 crc kubenswrapper[4830]: I0318 19:27:26.990162 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n6rfz"
Mar 18 19:27:26 crc kubenswrapper[4830]: I0318 19:27:26.990724 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:27:27 crc kubenswrapper[4830]: I0318 19:27:27.111091 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk5c2\" (UniqueName: \"kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2\") pod \"mariadb-client\" (UID: \"be6f87e7-6b08-44db-8451-7d098673af36\") " pod="openstack/mariadb-client"
Mar 18 19:27:27 crc kubenswrapper[4830]: I0318 19:27:27.213277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5c2\" (UniqueName: \"kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2\") pod \"mariadb-client\" (UID: \"be6f87e7-6b08-44db-8451-7d098673af36\") " pod="openstack/mariadb-client"
Mar 18 19:27:27 crc kubenswrapper[4830]: I0318 19:27:27.250817 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk5c2\" (UniqueName: \"kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2\") pod \"mariadb-client\" (UID: \"be6f87e7-6b08-44db-8451-7d098673af36\") " pod="openstack/mariadb-client"
Mar 18 19:27:27 crc kubenswrapper[4830]: I0318 19:27:27.315122 4830 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:27:27 crc kubenswrapper[4830]: W0318 19:27:27.937577 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe6f87e7_6b08_44db_8451_7d098673af36.slice/crio-9949be272aa628615381aab09d930f14324f52a83a5b4c011a9035ec9b7663fa WatchSource:0}: Error finding container 9949be272aa628615381aab09d930f14324f52a83a5b4c011a9035ec9b7663fa: Status 404 returned error can't find the container with id 9949be272aa628615381aab09d930f14324f52a83a5b4c011a9035ec9b7663fa Mar 18 19:27:27 crc kubenswrapper[4830]: I0318 19:27:27.947574 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:27:28 crc kubenswrapper[4830]: I0318 19:27:28.731650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be6f87e7-6b08-44db-8451-7d098673af36","Type":"ContainerStarted","Data":"9949be272aa628615381aab09d930f14324f52a83a5b4c011a9035ec9b7663fa"} Mar 18 19:27:34 crc kubenswrapper[4830]: E0318 19:27:34.438837 4830 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.39:59724->38.102.83.39:36855: read tcp 38.102.83.39:59724->38.102.83.39:36855: read: connection reset by peer Mar 18 19:27:34 crc kubenswrapper[4830]: E0318 19:27:34.438975 4830 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:59724->38.102.83.39:36855: write tcp 38.102.83.39:59724->38.102.83.39:36855: write: broken pipe Mar 18 19:27:34 crc kubenswrapper[4830]: I0318 19:27:34.781911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be6f87e7-6b08-44db-8451-7d098673af36","Type":"ContainerStarted","Data":"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8"} Mar 18 19:27:34 crc kubenswrapper[4830]: I0318 19:27:34.802733 4830 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/mariadb-client" podStartSLOduration=2.935990795 podStartE2EDuration="8.80270863s" podCreationTimestamp="2026-03-18 19:27:26 +0000 UTC" firstStartedPulling="2026-03-18 19:27:27.93890674 +0000 UTC m=+5082.506537082" lastFinishedPulling="2026-03-18 19:27:33.805624575 +0000 UTC m=+5088.373254917" observedRunningTime="2026-03-18 19:27:34.801207838 +0000 UTC m=+5089.368838250" watchObservedRunningTime="2026-03-18 19:27:34.80270863 +0000 UTC m=+5089.370339002" Mar 18 19:27:38 crc kubenswrapper[4830]: I0318 19:27:38.235411 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:27:38 crc kubenswrapper[4830]: E0318 19:27:38.237334 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.148676 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.149928 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="be6f87e7-6b08-44db-8451-7d098673af36" containerName="mariadb-client" containerID="cri-o://c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8" gracePeriod=30 Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.725659 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.842645 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk5c2\" (UniqueName: \"kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2\") pod \"be6f87e7-6b08-44db-8451-7d098673af36\" (UID: \"be6f87e7-6b08-44db-8451-7d098673af36\") " Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.849625 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2" (OuterVolumeSpecName: "kube-api-access-qk5c2") pod "be6f87e7-6b08-44db-8451-7d098673af36" (UID: "be6f87e7-6b08-44db-8451-7d098673af36"). InnerVolumeSpecName "kube-api-access-qk5c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.916560 4830 generic.go:334] "Generic (PLEG): container finished" podID="be6f87e7-6b08-44db-8451-7d098673af36" containerID="c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8" exitCode=143 Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.916621 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be6f87e7-6b08-44db-8451-7d098673af36","Type":"ContainerDied","Data":"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8"} Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.916653 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be6f87e7-6b08-44db-8451-7d098673af36","Type":"ContainerDied","Data":"9949be272aa628615381aab09d930f14324f52a83a5b4c011a9035ec9b7663fa"} Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.916658 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.916674 4830 scope.go:117] "RemoveContainer" containerID="c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.936875 4830 scope.go:117] "RemoveContainer" containerID="c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8" Mar 18 19:27:46 crc kubenswrapper[4830]: E0318 19:27:46.937340 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8\": container with ID starting with c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8 not found: ID does not exist" containerID="c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.937413 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8"} err="failed to get container status \"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8\": rpc error: code = NotFound desc = could not find container \"c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8\": container with ID starting with c44d37eafca987c1374fe72f9cd8e5f26d81f51686e4ee3cb13fb03c442e5ea8 not found: ID does not exist" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.945504 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk5c2\" (UniqueName: \"kubernetes.io/projected/be6f87e7-6b08-44db-8451-7d098673af36-kube-api-access-qk5c2\") on node \"crc\" DevicePath \"\"" Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.963374 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:27:46 crc kubenswrapper[4830]: I0318 19:27:46.969946 4830 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:27:48 crc kubenswrapper[4830]: I0318 19:27:48.265759 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6f87e7-6b08-44db-8451-7d098673af36" path="/var/lib/kubelet/pods/be6f87e7-6b08-44db-8451-7d098673af36/volumes" Mar 18 19:27:53 crc kubenswrapper[4830]: I0318 19:27:53.234604 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:27:53 crc kubenswrapper[4830]: E0318 19:27:53.235622 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.157481 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564368-6rcr9"] Mar 18 19:28:00 crc kubenswrapper[4830]: E0318 19:28:00.158631 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6f87e7-6b08-44db-8451-7d098673af36" containerName="mariadb-client" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.158649 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6f87e7-6b08-44db-8451-7d098673af36" containerName="mariadb-client" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.158877 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6f87e7-6b08-44db-8451-7d098673af36" containerName="mariadb-client" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.159431 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.162298 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.162307 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.162974 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.170031 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-6rcr9"] Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.272437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htqn\" (UniqueName: \"kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn\") pod \"auto-csr-approver-29564368-6rcr9\" (UID: \"b6e61852-2fc0-4654-9fe3-084767781d4d\") " pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.373046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htqn\" (UniqueName: \"kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn\") pod \"auto-csr-approver-29564368-6rcr9\" (UID: \"b6e61852-2fc0-4654-9fe3-084767781d4d\") " pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.398958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htqn\" (UniqueName: \"kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn\") pod \"auto-csr-approver-29564368-6rcr9\" (UID: \"b6e61852-2fc0-4654-9fe3-084767781d4d\") " 
pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.477933 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:00 crc kubenswrapper[4830]: I0318 19:28:00.958222 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-6rcr9"] Mar 18 19:28:01 crc kubenswrapper[4830]: I0318 19:28:01.058820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" event={"ID":"b6e61852-2fc0-4654-9fe3-084767781d4d","Type":"ContainerStarted","Data":"5d27bb5bdeb92e65c32ca6f8c57fa575cc1e7cac2e068f3de8e0548436edd9fc"} Mar 18 19:28:03 crc kubenswrapper[4830]: I0318 19:28:03.082514 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6e61852-2fc0-4654-9fe3-084767781d4d" containerID="4ac8ccfb14a6a8f02d4a7e58c9ffed928a41635000ad34fc38fad25125e240ba" exitCode=0 Mar 18 19:28:03 crc kubenswrapper[4830]: I0318 19:28:03.082592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" event={"ID":"b6e61852-2fc0-4654-9fe3-084767781d4d","Type":"ContainerDied","Data":"4ac8ccfb14a6a8f02d4a7e58c9ffed928a41635000ad34fc38fad25125e240ba"} Mar 18 19:28:04 crc kubenswrapper[4830]: I0318 19:28:04.519229 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:04 crc kubenswrapper[4830]: I0318 19:28:04.665335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htqn\" (UniqueName: \"kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn\") pod \"b6e61852-2fc0-4654-9fe3-084767781d4d\" (UID: \"b6e61852-2fc0-4654-9fe3-084767781d4d\") " Mar 18 19:28:04 crc kubenswrapper[4830]: I0318 19:28:04.673447 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn" (OuterVolumeSpecName: "kube-api-access-5htqn") pod "b6e61852-2fc0-4654-9fe3-084767781d4d" (UID: "b6e61852-2fc0-4654-9fe3-084767781d4d"). InnerVolumeSpecName "kube-api-access-5htqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:28:04 crc kubenswrapper[4830]: I0318 19:28:04.767116 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htqn\" (UniqueName: \"kubernetes.io/projected/b6e61852-2fc0-4654-9fe3-084767781d4d-kube-api-access-5htqn\") on node \"crc\" DevicePath \"\"" Mar 18 19:28:05 crc kubenswrapper[4830]: I0318 19:28:05.104509 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" event={"ID":"b6e61852-2fc0-4654-9fe3-084767781d4d","Type":"ContainerDied","Data":"5d27bb5bdeb92e65c32ca6f8c57fa575cc1e7cac2e068f3de8e0548436edd9fc"} Mar 18 19:28:05 crc kubenswrapper[4830]: I0318 19:28:05.104570 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d27bb5bdeb92e65c32ca6f8c57fa575cc1e7cac2e068f3de8e0548436edd9fc" Mar 18 19:28:05 crc kubenswrapper[4830]: I0318 19:28:05.105009 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-6rcr9" Mar 18 19:28:05 crc kubenswrapper[4830]: I0318 19:28:05.590711 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-mpc2b"] Mar 18 19:28:05 crc kubenswrapper[4830]: I0318 19:28:05.597594 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-mpc2b"] Mar 18 19:28:06 crc kubenswrapper[4830]: I0318 19:28:06.243116 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:28:06 crc kubenswrapper[4830]: E0318 19:28:06.243508 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:28:06 crc kubenswrapper[4830]: I0318 19:28:06.251493 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57ef22a-4cfb-4f81-841c-23851f749849" path="/var/lib/kubelet/pods/b57ef22a-4cfb-4f81-841c-23851f749849/volumes" Mar 18 19:28:20 crc kubenswrapper[4830]: I0318 19:28:20.185368 4830 scope.go:117] "RemoveContainer" containerID="43ba8d7b6a349d4d3df028886c0c5146d3f6ac7931e5a6fc910670598d34a940" Mar 18 19:28:21 crc kubenswrapper[4830]: I0318 19:28:21.234731 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:28:21 crc kubenswrapper[4830]: E0318 19:28:21.235205 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:28:36 crc kubenswrapper[4830]: I0318 19:28:36.241593 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:28:36 crc kubenswrapper[4830]: E0318 19:28:36.242446 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:28:49 crc kubenswrapper[4830]: I0318 19:28:49.234557 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:28:49 crc kubenswrapper[4830]: E0318 19:28:49.236255 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.073217 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"] Mar 18 19:28:59 crc kubenswrapper[4830]: E0318 19:28:59.074604 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e61852-2fc0-4654-9fe3-084767781d4d" containerName="oc" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.074621 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e61852-2fc0-4654-9fe3-084767781d4d" containerName="oc" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.074812 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e61852-2fc0-4654-9fe3-084767781d4d" containerName="oc" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.076237 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.083065 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"] Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.115210 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.115505 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdl9\" (UniqueName: \"kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.115603 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.216559 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.216870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.216984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.217123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdl9\" (UniqueName: \"kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.217310 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.273227 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5rdl9\" (UniqueName: \"kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9\") pod \"redhat-marketplace-dnfrh\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") " pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.415811 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnfrh" Mar 18 19:28:59 crc kubenswrapper[4830]: I0318 19:28:59.656120 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"] Mar 18 19:29:00 crc kubenswrapper[4830]: I0318 19:29:00.629102 4830 generic.go:334] "Generic (PLEG): container finished" podID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerID="c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722" exitCode=0 Mar 18 19:29:00 crc kubenswrapper[4830]: I0318 19:29:00.629176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerDied","Data":"c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722"} Mar 18 19:29:00 crc kubenswrapper[4830]: I0318 19:29:00.629600 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerStarted","Data":"81968f0826854906c1925889506cd48172bdfd61548a85b9a48534dcbeb35a3a"} Mar 18 19:29:01 crc kubenswrapper[4830]: I0318 19:29:01.643196 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerStarted","Data":"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"} Mar 18 19:29:02 crc kubenswrapper[4830]: I0318 19:29:02.235050 4830 scope.go:117] "RemoveContainer" 
containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:29:02 crc kubenswrapper[4830]: E0318 19:29:02.235725 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:29:02 crc kubenswrapper[4830]: I0318 19:29:02.653733 4830 generic.go:334] "Generic (PLEG): container finished" podID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerID="c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7" exitCode=0 Mar 18 19:29:02 crc kubenswrapper[4830]: I0318 19:29:02.653800 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerDied","Data":"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"} Mar 18 19:29:04 crc kubenswrapper[4830]: I0318 19:29:04.669360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerStarted","Data":"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"} Mar 18 19:29:04 crc kubenswrapper[4830]: I0318 19:29:04.692651 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dnfrh" podStartSLOduration=2.750521745 podStartE2EDuration="5.692626523s" podCreationTimestamp="2026-03-18 19:28:59 +0000 UTC" firstStartedPulling="2026-03-18 19:29:00.630848574 +0000 UTC m=+5175.198478956" lastFinishedPulling="2026-03-18 19:29:03.572953392 +0000 UTC m=+5178.140583734" observedRunningTime="2026-03-18 19:29:04.68684948 +0000 UTC 
m=+5179.254479832" watchObservedRunningTime="2026-03-18 19:29:04.692626523 +0000 UTC m=+5179.260256885"
Mar 18 19:29:09 crc kubenswrapper[4830]: I0318 19:29:09.416444 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:09 crc kubenswrapper[4830]: I0318 19:29:09.417181 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:09 crc kubenswrapper[4830]: I0318 19:29:09.492807 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:09 crc kubenswrapper[4830]: I0318 19:29:09.763830 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:09 crc kubenswrapper[4830]: I0318 19:29:09.811399 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"]
Mar 18 19:29:11 crc kubenswrapper[4830]: I0318 19:29:11.728398 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dnfrh" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="registry-server" containerID="cri-o://82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55" gracePeriod=2
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.308428 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.405415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdl9\" (UniqueName: \"kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9\") pod \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") "
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.405480 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities\") pod \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") "
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.405573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content\") pod \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\" (UID: \"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1\") "
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.407048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities" (OuterVolumeSpecName: "utilities") pod "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" (UID: "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.431502 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" (UID: "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.473338 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9" (OuterVolumeSpecName: "kube-api-access-5rdl9") pod "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" (UID: "f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1"). InnerVolumeSpecName "kube-api-access-5rdl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.506987 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.507019 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rdl9\" (UniqueName: \"kubernetes.io/projected/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-kube-api-access-5rdl9\") on node \"crc\" DevicePath \"\""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.507032 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.739205 4830 generic.go:334] "Generic (PLEG): container finished" podID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerID="82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55" exitCode=0
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.739262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerDied","Data":"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"}
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.739313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnfrh" event={"ID":"f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1","Type":"ContainerDied","Data":"81968f0826854906c1925889506cd48172bdfd61548a85b9a48534dcbeb35a3a"}
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.739360 4830 scope.go:117] "RemoveContainer" containerID="82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.740227 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnfrh"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.764929 4830 scope.go:117] "RemoveContainer" containerID="c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.791596 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"]
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.802291 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnfrh"]
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.806964 4830 scope.go:117] "RemoveContainer" containerID="c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.836016 4830 scope.go:117] "RemoveContainer" containerID="82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"
Mar 18 19:29:12 crc kubenswrapper[4830]: E0318 19:29:12.836582 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55\": container with ID starting with 82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55 not found: ID does not exist" containerID="82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.836644 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55"} err="failed to get container status \"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55\": rpc error: code = NotFound desc = could not find container \"82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55\": container with ID starting with 82782ae7e4e1006717dae7b4d7e7099a0507e786de0af43c5c9b524356148d55 not found: ID does not exist"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.836676 4830 scope.go:117] "RemoveContainer" containerID="c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"
Mar 18 19:29:12 crc kubenswrapper[4830]: E0318 19:29:12.837264 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7\": container with ID starting with c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7 not found: ID does not exist" containerID="c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.837310 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7"} err="failed to get container status \"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7\": rpc error: code = NotFound desc = could not find container \"c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7\": container with ID starting with c627940362e25b781c5d420400396aedf1045f530db31340d760a60375fc39b7 not found: ID does not exist"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.837344 4830 scope.go:117] "RemoveContainer" containerID="c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722"
Mar 18 19:29:12 crc kubenswrapper[4830]: E0318 19:29:12.838042 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722\": container with ID starting with c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722 not found: ID does not exist" containerID="c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722"
Mar 18 19:29:12 crc kubenswrapper[4830]: I0318 19:29:12.838125 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722"} err="failed to get container status \"c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722\": rpc error: code = NotFound desc = could not find container \"c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722\": container with ID starting with c9786aeacbfe08c06dc46e5fc8abeeb0c1caee6a8d70c0789aab199835ee7722 not found: ID does not exist"
Mar 18 19:29:14 crc kubenswrapper[4830]: I0318 19:29:14.236378 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:29:14 crc kubenswrapper[4830]: E0318 19:29:14.237349 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:29:14 crc kubenswrapper[4830]: I0318 19:29:14.256622 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" path="/var/lib/kubelet/pods/f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1/volumes"
Mar 18 19:29:20 crc kubenswrapper[4830]: I0318 19:29:20.300087 4830 scope.go:117] "RemoveContainer" containerID="65fa2f44cdf2b50b7675724bf08b7be35806b60643093ea085b9f69536a61c76"
Mar 18 19:29:28 crc kubenswrapper[4830]: I0318 19:29:28.235716 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:29:28 crc kubenswrapper[4830]: E0318 19:29:28.236899 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:29:42 crc kubenswrapper[4830]: I0318 19:29:42.234731 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:29:42 crc kubenswrapper[4830]: E0318 19:29:42.235730 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:29:57 crc kubenswrapper[4830]: I0318 19:29:57.235218 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:29:57 crc kubenswrapper[4830]: E0318 19:29:57.236507 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.169397 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564370-57ddf"]
Mar 18 19:30:00 crc kubenswrapper[4830]: E0318 19:30:00.170554 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="registry-server"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.170577 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="registry-server"
Mar 18 19:30:00 crc kubenswrapper[4830]: E0318 19:30:00.170597 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="extract-content"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.170612 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="extract-content"
Mar 18 19:30:00 crc kubenswrapper[4830]: E0318 19:30:00.170626 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="extract-utilities"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.170637 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="extract-utilities"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.170943 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b2bd75-5b45-4829-acd2-2bbbcd6d97b1" containerName="registry-server"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.171748 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.174388 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.174753 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.177609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.188568 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"]
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.189912 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.195610 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.197431 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.199604 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-57ddf"]
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.212995 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"]
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.341490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.341648 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.341710 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqkx\" (UniqueName: \"kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.341954 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc8n\" (UniqueName: \"kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n\") pod \"auto-csr-approver-29564370-57ddf\" (UID: \"70b24fea-e1aa-4450-89d7-6f932c84f2c7\") " pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.444247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.444405 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqkx\" (UniqueName: \"kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.444462 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc8n\" (UniqueName: \"kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n\") pod \"auto-csr-approver-29564370-57ddf\" (UID: \"70b24fea-e1aa-4450-89d7-6f932c84f2c7\") " pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.444564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.445710 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.452226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.463556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc8n\" (UniqueName: \"kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n\") pod \"auto-csr-approver-29564370-57ddf\" (UID: \"70b24fea-e1aa-4450-89d7-6f932c84f2c7\") " pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.466346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqkx\" (UniqueName: \"kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx\") pod \"collect-profiles-29564370-nxnrx\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.502970 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.519921 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.797729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"]
Mar 18 19:30:00 crc kubenswrapper[4830]: I0318 19:30:00.923643 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-57ddf"]
Mar 18 19:30:00 crc kubenswrapper[4830]: W0318 19:30:00.931093 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70b24fea_e1aa_4450_89d7_6f932c84f2c7.slice/crio-640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9 WatchSource:0}: Error finding container 640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9: Status 404 returned error can't find the container with id 640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9
Mar 18 19:30:01 crc kubenswrapper[4830]: I0318 19:30:01.205390 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx" event={"ID":"8f00e1f8-d5f4-4951-af1d-b8704426062d","Type":"ContainerStarted","Data":"c4f31cb85f4b5135a3b20cefda9d5c2e0e3aba9db4979de77cc3fca9c4562790"}
Mar 18 19:30:01 crc kubenswrapper[4830]: I0318 19:30:01.205493 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx" event={"ID":"8f00e1f8-d5f4-4951-af1d-b8704426062d","Type":"ContainerStarted","Data":"00c5bc50b69e826d1e426cba870096d80472242d8d835dee4662a72c04a7a1dc"}
Mar 18 19:30:01 crc kubenswrapper[4830]: I0318 19:30:01.207440 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-57ddf" event={"ID":"70b24fea-e1aa-4450-89d7-6f932c84f2c7","Type":"ContainerStarted","Data":"640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9"}
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.218249 4830 generic.go:334] "Generic (PLEG): container finished" podID="8f00e1f8-d5f4-4951-af1d-b8704426062d" containerID="c4f31cb85f4b5135a3b20cefda9d5c2e0e3aba9db4979de77cc3fca9c4562790" exitCode=0
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.218330 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx" event={"ID":"8f00e1f8-d5f4-4951-af1d-b8704426062d","Type":"ContainerDied","Data":"c4f31cb85f4b5135a3b20cefda9d5c2e0e3aba9db4979de77cc3fca9c4562790"}
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.518893 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.681363 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqkx\" (UniqueName: \"kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx\") pod \"8f00e1f8-d5f4-4951-af1d-b8704426062d\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") "
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.681445 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume\") pod \"8f00e1f8-d5f4-4951-af1d-b8704426062d\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") "
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.682319 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume\") pod \"8f00e1f8-d5f4-4951-af1d-b8704426062d\" (UID: \"8f00e1f8-d5f4-4951-af1d-b8704426062d\") "
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.683062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f00e1f8-d5f4-4951-af1d-b8704426062d" (UID: "8f00e1f8-d5f4-4951-af1d-b8704426062d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.683619 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f00e1f8-d5f4-4951-af1d-b8704426062d-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.690386 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f00e1f8-d5f4-4951-af1d-b8704426062d" (UID: "8f00e1f8-d5f4-4951-af1d-b8704426062d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.690503 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx" (OuterVolumeSpecName: "kube-api-access-9gqkx") pod "8f00e1f8-d5f4-4951-af1d-b8704426062d" (UID: "8f00e1f8-d5f4-4951-af1d-b8704426062d"). InnerVolumeSpecName "kube-api-access-9gqkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.785582 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqkx\" (UniqueName: \"kubernetes.io/projected/8f00e1f8-d5f4-4951-af1d-b8704426062d-kube-api-access-9gqkx\") on node \"crc\" DevicePath \"\""
Mar 18 19:30:02 crc kubenswrapper[4830]: I0318 19:30:02.785900 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f00e1f8-d5f4-4951-af1d-b8704426062d-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.231993 4830 generic.go:334] "Generic (PLEG): container finished" podID="70b24fea-e1aa-4450-89d7-6f932c84f2c7" containerID="7c0e8892ac206e736e07af575208adb9e3727da708072c019517ec285b107e9a" exitCode=0
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.232143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-57ddf" event={"ID":"70b24fea-e1aa-4450-89d7-6f932c84f2c7","Type":"ContainerDied","Data":"7c0e8892ac206e736e07af575208adb9e3727da708072c019517ec285b107e9a"}
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.237077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx" event={"ID":"8f00e1f8-d5f4-4951-af1d-b8704426062d","Type":"ContainerDied","Data":"00c5bc50b69e826d1e426cba870096d80472242d8d835dee4662a72c04a7a1dc"}
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.237134 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-nxnrx"
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.237145 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c5bc50b69e826d1e426cba870096d80472242d8d835dee4662a72c04a7a1dc"
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.618450 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"]
Mar 18 19:30:03 crc kubenswrapper[4830]: I0318 19:30:03.630687 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zzz7z"]
Mar 18 19:30:04 crc kubenswrapper[4830]: I0318 19:30:04.253664 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1957d43b-bc6c-4df6-92fa-b934f78770ea" path="/var/lib/kubelet/pods/1957d43b-bc6c-4df6-92fa-b934f78770ea/volumes"
Mar 18 19:30:04 crc kubenswrapper[4830]: I0318 19:30:04.631762 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:04 crc kubenswrapper[4830]: I0318 19:30:04.822928 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snc8n\" (UniqueName: \"kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n\") pod \"70b24fea-e1aa-4450-89d7-6f932c84f2c7\" (UID: \"70b24fea-e1aa-4450-89d7-6f932c84f2c7\") "
Mar 18 19:30:04 crc kubenswrapper[4830]: I0318 19:30:04.832148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n" (OuterVolumeSpecName: "kube-api-access-snc8n") pod "70b24fea-e1aa-4450-89d7-6f932c84f2c7" (UID: "70b24fea-e1aa-4450-89d7-6f932c84f2c7"). InnerVolumeSpecName "kube-api-access-snc8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:30:04 crc kubenswrapper[4830]: I0318 19:30:04.925328 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snc8n\" (UniqueName: \"kubernetes.io/projected/70b24fea-e1aa-4450-89d7-6f932c84f2c7-kube-api-access-snc8n\") on node \"crc\" DevicePath \"\""
Mar 18 19:30:05 crc kubenswrapper[4830]: I0318 19:30:05.260014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-57ddf" event={"ID":"70b24fea-e1aa-4450-89d7-6f932c84f2c7","Type":"ContainerDied","Data":"640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9"}
Mar 18 19:30:05 crc kubenswrapper[4830]: I0318 19:30:05.260468 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="640138994cc19236c341d41535e4a03966e9b51aecee3a2e2dc7f2e3ac21c7b9"
Mar 18 19:30:05 crc kubenswrapper[4830]: I0318 19:30:05.260104 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-57ddf"
Mar 18 19:30:05 crc kubenswrapper[4830]: I0318 19:30:05.679981 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-hxvwf"]
Mar 18 19:30:05 crc kubenswrapper[4830]: I0318 19:30:05.694585 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-hxvwf"]
Mar 18 19:30:06 crc kubenswrapper[4830]: I0318 19:30:06.242395 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d23f38d-d210-46b2-9f30-b80e980f274c" path="/var/lib/kubelet/pods/4d23f38d-d210-46b2-9f30-b80e980f274c/volumes"
Mar 18 19:30:12 crc kubenswrapper[4830]: I0318 19:30:12.235173 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:30:12 crc kubenswrapper[4830]: E0318 19:30:12.236253 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:30:20 crc kubenswrapper[4830]: I0318 19:30:20.376108 4830 scope.go:117] "RemoveContainer" containerID="3335daa75f545881a57e4c88cb6ce0b05e351fc8483403bbb1dd1e8800cec6d2"
Mar 18 19:30:20 crc kubenswrapper[4830]: I0318 19:30:20.417676 4830 scope.go:117] "RemoveContainer" containerID="0ecffee954cff0fe8e929d6ef29507d5e60e90c7f2a06198355eab86d41f3f4b"
Mar 18 19:30:26 crc kubenswrapper[4830]: I0318 19:30:26.241995 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:30:26 crc kubenswrapper[4830]: E0318 19:30:26.243155 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:30:38 crc kubenswrapper[4830]: I0318 19:30:38.234319 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:30:38 crc kubenswrapper[4830]: E0318 19:30:38.236014 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:30:53 crc kubenswrapper[4830]: I0318 19:30:53.234556 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:30:53 crc kubenswrapper[4830]: E0318 19:30:53.235341 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:31:07 crc kubenswrapper[4830]: I0318 19:31:07.235323 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:31:07 crc kubenswrapper[4830]: E0318 19:31:07.236518 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:31:19 crc kubenswrapper[4830]: I0318 19:31:19.234666 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:31:19 crc kubenswrapper[4830]: E0318 19:31:19.235728 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.906465 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Mar 18 19:31:25 crc kubenswrapper[4830]: E0318 19:31:25.907920 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f00e1f8-d5f4-4951-af1d-b8704426062d" containerName="collect-profiles"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.907957 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f00e1f8-d5f4-4951-af1d-b8704426062d" containerName="collect-profiles"
Mar 18 19:31:25 crc kubenswrapper[4830]: E0318 19:31:25.908060 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b24fea-e1aa-4450-89d7-6f932c84f2c7" containerName="oc"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.908079 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b24fea-e1aa-4450-89d7-6f932c84f2c7" containerName="oc"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.908424 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f00e1f8-d5f4-4951-af1d-b8704426062d" containerName="collect-profiles"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.908452 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b24fea-e1aa-4450-89d7-6f932c84f2c7" containerName="oc"
Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.909553 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.911515 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n6rfz" Mar 18 19:31:25 crc kubenswrapper[4830]: I0318 19:31:25.930968 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.032416 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8nl\" (UniqueName: \"kubernetes.io/projected/11c7a099-6459-475e-9c09-d2acfacec884-kube-api-access-vk8nl\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data" Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.032639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data" Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.134544 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data" Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.134650 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8nl\" (UniqueName: \"kubernetes.io/projected/11c7a099-6459-475e-9c09-d2acfacec884-kube-api-access-vk8nl\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data" 
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.139338 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.139450 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b16ecd7679df562450391d5981f62b93a5898b7412afb77bb2cdb702c7ea3c73/globalmount\"" pod="openstack/mariadb-copy-data"
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.161391 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8nl\" (UniqueName: \"kubernetes.io/projected/11c7a099-6459-475e-9c09-d2acfacec884-kube-api-access-vk8nl\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data"
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.190014 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ac8d60b-d7a9-400d-92e7-7a529aef80b1\") pod \"mariadb-copy-data\" (UID: \"11c7a099-6459-475e-9c09-d2acfacec884\") " pod="openstack/mariadb-copy-data"
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.237693 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 18 19:31:26 crc kubenswrapper[4830]: I0318 19:31:26.811137 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 18 19:31:27 crc kubenswrapper[4830]: I0318 19:31:27.008740 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"11c7a099-6459-475e-9c09-d2acfacec884","Type":"ContainerStarted","Data":"df953ae65055fb5dc96db58aeca9d58fd3b7502df8bb41c3ecef52e12a616d46"}
Mar 18 19:31:27 crc kubenswrapper[4830]: I0318 19:31:27.009158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"11c7a099-6459-475e-9c09-d2acfacec884","Type":"ContainerStarted","Data":"3e3c79789a315e6f40fb3308aa1642f53ed016fe70842ea87050804126b27ca0"}
Mar 18 19:31:27 crc kubenswrapper[4830]: I0318 19:31:27.053294 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.053262346 podStartE2EDuration="3.053262346s" podCreationTimestamp="2026-03-18 19:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:31:27.048170062 +0000 UTC m=+5321.615800404" watchObservedRunningTime="2026-03-18 19:31:27.053262346 +0000 UTC m=+5321.620892718"
Mar 18 19:31:29 crc kubenswrapper[4830]: I0318 19:31:29.934246 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:29 crc kubenswrapper[4830]: I0318 19:31:29.936432 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:29 crc kubenswrapper[4830]: I0318 19:31:29.943415 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.005270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxkl\" (UniqueName: \"kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl\") pod \"mariadb-client\" (UID: \"bcb712c7-bcae-45aa-9565-02bd1001336c\") " pod="openstack/mariadb-client"
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.107343 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxkl\" (UniqueName: \"kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl\") pod \"mariadb-client\" (UID: \"bcb712c7-bcae-45aa-9565-02bd1001336c\") " pod="openstack/mariadb-client"
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.138814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxkl\" (UniqueName: \"kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl\") pod \"mariadb-client\" (UID: \"bcb712c7-bcae-45aa-9565-02bd1001336c\") " pod="openstack/mariadb-client"
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.235210 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:31:30 crc kubenswrapper[4830]: E0318 19:31:30.236244 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.270636 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:30 crc kubenswrapper[4830]: I0318 19:31:30.514190 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:30 crc kubenswrapper[4830]: W0318 19:31:30.524307 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb712c7_bcae_45aa_9565_02bd1001336c.slice/crio-3f881ea1b86e5276aa8923dbe5777d6c3906677284da5de90ddc864e2abe02e7 WatchSource:0}: Error finding container 3f881ea1b86e5276aa8923dbe5777d6c3906677284da5de90ddc864e2abe02e7: Status 404 returned error can't find the container with id 3f881ea1b86e5276aa8923dbe5777d6c3906677284da5de90ddc864e2abe02e7
Mar 18 19:31:31 crc kubenswrapper[4830]: I0318 19:31:31.047647 4830 generic.go:334] "Generic (PLEG): container finished" podID="bcb712c7-bcae-45aa-9565-02bd1001336c" containerID="fe3295288a11cae05a63cabc68d21d080181f6943b89ea74288fd70d8f0385f5" exitCode=0
Mar 18 19:31:31 crc kubenswrapper[4830]: I0318 19:31:31.047712 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bcb712c7-bcae-45aa-9565-02bd1001336c","Type":"ContainerDied","Data":"fe3295288a11cae05a63cabc68d21d080181f6943b89ea74288fd70d8f0385f5"}
Mar 18 19:31:31 crc kubenswrapper[4830]: I0318 19:31:31.047747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bcb712c7-bcae-45aa-9565-02bd1001336c","Type":"ContainerStarted","Data":"3f881ea1b86e5276aa8923dbe5777d6c3906677284da5de90ddc864e2abe02e7"}
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.388230 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.414283 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_bcb712c7-bcae-45aa-9565-02bd1001336c/mariadb-client/0.log"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.445829 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbxkl\" (UniqueName: \"kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl\") pod \"bcb712c7-bcae-45aa-9565-02bd1001336c\" (UID: \"bcb712c7-bcae-45aa-9565-02bd1001336c\") "
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.451501 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.456363 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl" (OuterVolumeSpecName: "kube-api-access-hbxkl") pod "bcb712c7-bcae-45aa-9565-02bd1001336c" (UID: "bcb712c7-bcae-45aa-9565-02bd1001336c"). InnerVolumeSpecName "kube-api-access-hbxkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.458040 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.548615 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbxkl\" (UniqueName: \"kubernetes.io/projected/bcb712c7-bcae-45aa-9565-02bd1001336c-kube-api-access-hbxkl\") on node \"crc\" DevicePath \"\""
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.562443 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:32 crc kubenswrapper[4830]: E0318 19:31:32.562802 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb712c7-bcae-45aa-9565-02bd1001336c" containerName="mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.562816 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb712c7-bcae-45aa-9565-02bd1001336c" containerName="mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.563006 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb712c7-bcae-45aa-9565-02bd1001336c" containerName="mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.563547 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.571760 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.650225 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbr25\" (UniqueName: \"kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25\") pod \"mariadb-client\" (UID: \"10400d4e-601e-4710-b36d-dd0159ff726a\") " pod="openstack/mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.752040 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbr25\" (UniqueName: \"kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25\") pod \"mariadb-client\" (UID: \"10400d4e-601e-4710-b36d-dd0159ff726a\") " pod="openstack/mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.779992 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbr25\" (UniqueName: \"kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25\") pod \"mariadb-client\" (UID: \"10400d4e-601e-4710-b36d-dd0159ff726a\") " pod="openstack/mariadb-client"
Mar 18 19:31:32 crc kubenswrapper[4830]: I0318 19:31:32.886229 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:33 crc kubenswrapper[4830]: I0318 19:31:33.067480 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f881ea1b86e5276aa8923dbe5777d6c3906677284da5de90ddc864e2abe02e7"
Mar 18 19:31:33 crc kubenswrapper[4830]: I0318 19:31:33.068263 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:33 crc kubenswrapper[4830]: I0318 19:31:33.102434 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="bcb712c7-bcae-45aa-9565-02bd1001336c" podUID="10400d4e-601e-4710-b36d-dd0159ff726a"
Mar 18 19:31:33 crc kubenswrapper[4830]: I0318 19:31:33.380712 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:33 crc kubenswrapper[4830]: W0318 19:31:33.390339 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10400d4e_601e_4710_b36d_dd0159ff726a.slice/crio-47a288be7d0405a6e077362c92199c1fd182f079393558dcfb9716b488b1add7 WatchSource:0}: Error finding container 47a288be7d0405a6e077362c92199c1fd182f079393558dcfb9716b488b1add7: Status 404 returned error can't find the container with id 47a288be7d0405a6e077362c92199c1fd182f079393558dcfb9716b488b1add7
Mar 18 19:31:34 crc kubenswrapper[4830]: I0318 19:31:34.082970 4830 generic.go:334] "Generic (PLEG): container finished" podID="10400d4e-601e-4710-b36d-dd0159ff726a" containerID="5a6fb9d5be538d1cba3e482b20abe4416f054634107597fd5210bf1c45d1c1f5" exitCode=0
Mar 18 19:31:34 crc kubenswrapper[4830]: I0318 19:31:34.083349 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"10400d4e-601e-4710-b36d-dd0159ff726a","Type":"ContainerDied","Data":"5a6fb9d5be538d1cba3e482b20abe4416f054634107597fd5210bf1c45d1c1f5"}
Mar 18 19:31:34 crc kubenswrapper[4830]: I0318 19:31:34.083390 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"10400d4e-601e-4710-b36d-dd0159ff726a","Type":"ContainerStarted","Data":"47a288be7d0405a6e077362c92199c1fd182f079393558dcfb9716b488b1add7"}
Mar 18 19:31:34 crc kubenswrapper[4830]: I0318 19:31:34.249734 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb712c7-bcae-45aa-9565-02bd1001336c" path="/var/lib/kubelet/pods/bcb712c7-bcae-45aa-9565-02bd1001336c/volumes"
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.487212 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.501789 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbr25\" (UniqueName: \"kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25\") pod \"10400d4e-601e-4710-b36d-dd0159ff726a\" (UID: \"10400d4e-601e-4710-b36d-dd0159ff726a\") "
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.504868 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_10400d4e-601e-4710-b36d-dd0159ff726a/mariadb-client/0.log"
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.507890 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25" (OuterVolumeSpecName: "kube-api-access-tbr25") pod "10400d4e-601e-4710-b36d-dd0159ff726a" (UID: "10400d4e-601e-4710-b36d-dd0159ff726a"). InnerVolumeSpecName "kube-api-access-tbr25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.531830 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.539415 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 18 19:31:35 crc kubenswrapper[4830]: I0318 19:31:35.605375 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbr25\" (UniqueName: \"kubernetes.io/projected/10400d4e-601e-4710-b36d-dd0159ff726a-kube-api-access-tbr25\") on node \"crc\" DevicePath \"\""
Mar 18 19:31:36 crc kubenswrapper[4830]: I0318 19:31:36.106444 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a288be7d0405a6e077362c92199c1fd182f079393558dcfb9716b488b1add7"
Mar 18 19:31:36 crc kubenswrapper[4830]: I0318 19:31:36.106508 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 19:31:36 crc kubenswrapper[4830]: I0318 19:31:36.253119 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10400d4e-601e-4710-b36d-dd0159ff726a" path="/var/lib/kubelet/pods/10400d4e-601e-4710-b36d-dd0159ff726a/volumes"
Mar 18 19:31:44 crc kubenswrapper[4830]: I0318 19:31:44.235046 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:31:44 crc kubenswrapper[4830]: E0318 19:31:44.236390 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:31:56 crc kubenswrapper[4830]: I0318 19:31:56.244659 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:31:56 crc kubenswrapper[4830]: E0318 19:31:56.246634 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.162404 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564372-f5wwf"]
Mar 18 19:32:00 crc kubenswrapper[4830]: E0318 19:32:00.163332 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10400d4e-601e-4710-b36d-dd0159ff726a" containerName="mariadb-client"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.163352 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="10400d4e-601e-4710-b36d-dd0159ff726a" containerName="mariadb-client"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.163601 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="10400d4e-601e-4710-b36d-dd0159ff726a" containerName="mariadb-client"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.164384 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.167549 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.167652 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.168322 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.173089 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-f5wwf"]
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.323737 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7dhh\" (UniqueName: \"kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh\") pod \"auto-csr-approver-29564372-f5wwf\" (UID: \"31812e16-dacb-42b9-90ca-91a269bc452e\") " pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.425598 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7dhh\" (UniqueName: \"kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh\") pod \"auto-csr-approver-29564372-f5wwf\" (UID: \"31812e16-dacb-42b9-90ca-91a269bc452e\") " pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.463870 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7dhh\" (UniqueName: \"kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh\") pod \"auto-csr-approver-29564372-f5wwf\" (UID: \"31812e16-dacb-42b9-90ca-91a269bc452e\") " pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.484613 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.760977 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-f5wwf"]
Mar 18 19:32:00 crc kubenswrapper[4830]: I0318 19:32:00.773199 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 19:32:01 crc kubenswrapper[4830]: I0318 19:32:01.336346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-f5wwf" event={"ID":"31812e16-dacb-42b9-90ca-91a269bc452e","Type":"ContainerStarted","Data":"5fa0486c37636ed420aa474240329e4636093c47d2277daccc9d6aea4a268a05"}
Mar 18 19:32:02 crc kubenswrapper[4830]: I0318 19:32:02.346369 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-f5wwf" event={"ID":"31812e16-dacb-42b9-90ca-91a269bc452e","Type":"ContainerStarted","Data":"8619914e4a023da11a3e9be1f7c2e238278ad285ee79f0240a5698a24e4b92cd"}
Mar 18 19:32:02 crc kubenswrapper[4830]: I0318 19:32:02.366975 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564372-f5wwf" podStartSLOduration=1.233957675 podStartE2EDuration="2.36695037s" podCreationTimestamp="2026-03-18 19:32:00 +0000 UTC" firstStartedPulling="2026-03-18 19:32:00.772870268 +0000 UTC m=+5355.340500600" lastFinishedPulling="2026-03-18 19:32:01.905862963 +0000 UTC m=+5356.473493295" observedRunningTime="2026-03-18 19:32:02.359758537 +0000 UTC m=+5356.927388879" watchObservedRunningTime="2026-03-18 19:32:02.36695037 +0000 UTC m=+5356.934580702"
Mar 18 19:32:03 crc kubenswrapper[4830]: I0318 19:32:03.358948 4830 generic.go:334] "Generic (PLEG): container finished" podID="31812e16-dacb-42b9-90ca-91a269bc452e" containerID="8619914e4a023da11a3e9be1f7c2e238278ad285ee79f0240a5698a24e4b92cd" exitCode=0
Mar 18 19:32:03 crc kubenswrapper[4830]: I0318 19:32:03.359001 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-f5wwf" event={"ID":"31812e16-dacb-42b9-90ca-91a269bc452e","Type":"ContainerDied","Data":"8619914e4a023da11a3e9be1f7c2e238278ad285ee79f0240a5698a24e4b92cd"}
Mar 18 19:32:04 crc kubenswrapper[4830]: I0318 19:32:04.756024 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:04 crc kubenswrapper[4830]: I0318 19:32:04.901144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7dhh\" (UniqueName: \"kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh\") pod \"31812e16-dacb-42b9-90ca-91a269bc452e\" (UID: \"31812e16-dacb-42b9-90ca-91a269bc452e\") "
Mar 18 19:32:04 crc kubenswrapper[4830]: I0318 19:32:04.910459 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh" (OuterVolumeSpecName: "kube-api-access-b7dhh") pod "31812e16-dacb-42b9-90ca-91a269bc452e" (UID: "31812e16-dacb-42b9-90ca-91a269bc452e"). InnerVolumeSpecName "kube-api-access-b7dhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.003173 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7dhh\" (UniqueName: \"kubernetes.io/projected/31812e16-dacb-42b9-90ca-91a269bc452e-kube-api-access-b7dhh\") on node \"crc\" DevicePath \"\""
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.379354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-f5wwf" event={"ID":"31812e16-dacb-42b9-90ca-91a269bc452e","Type":"ContainerDied","Data":"5fa0486c37636ed420aa474240329e4636093c47d2277daccc9d6aea4a268a05"}
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.379428 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa0486c37636ed420aa474240329e4636093c47d2277daccc9d6aea4a268a05"
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.379439 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-f5wwf"
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.443040 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-dwmzn"]
Mar 18 19:32:05 crc kubenswrapper[4830]: I0318 19:32:05.450529 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-dwmzn"]
Mar 18 19:32:06 crc kubenswrapper[4830]: I0318 19:32:06.257412 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e011eb-ce09-43f1-82ec-f3d3c1b025b4" path="/var/lib/kubelet/pods/84e011eb-ce09-43f1-82ec-f3d3c1b025b4/volumes"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.122897 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 19:32:07 crc kubenswrapper[4830]: E0318 19:32:07.123308 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31812e16-dacb-42b9-90ca-91a269bc452e" containerName="oc"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.123342 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="31812e16-dacb-42b9-90ca-91a269bc452e" containerName="oc"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.123669 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="31812e16-dacb-42b9-90ca-91a269bc452e" containerName="oc"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.125324 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.127720 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.128227 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.128441 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.128899 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.129415 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-n9jt5"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.144879 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.147524 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.156067 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.166739 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.176637 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.178891 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.193806 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.234659 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4grqk\" (UniqueName: \"kubernetes.io/projected/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-kube-api-access-4grqk\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237686 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237722 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237745 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ec175dde-29e2-445b-b288-d8f054a98111\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec175dde-29e2-445b-b288-d8f054a98111\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.237925 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName:
\"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.339511 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.339555 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.339703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ec175dde-29e2-445b-b288-d8f054a98111\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec175dde-29e2-445b-b288-d8f054a98111\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.339749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.339796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxnp\" (UniqueName: 
\"kubernetes.io/projected/aa7136b8-263e-426e-9a0b-b9951c57dc16-kube-api-access-vrxnp\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340409 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340742 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-config\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.340801 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-config\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.341890 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4grqk\" (UniqueName: \"kubernetes.io/projected/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-kube-api-access-4grqk\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.341929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.341956 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342014 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342088 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcxqc\" (UniqueName: \"kubernetes.io/projected/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-kube-api-access-jcxqc\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342126 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342151 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 
19:32:07.342201 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342277 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e38b8410-9e8c-4240-a897-db813e0bda49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e38b8410-9e8c-4240-a897-db813e0bda49\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342320 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd401975-ad09-4e4d-a943-01aad83a3778\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd401975-ad09-4e4d-a943-01aad83a3778\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.342696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.343854 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.344010 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.344046 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ec175dde-29e2-445b-b288-d8f054a98111\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec175dde-29e2-445b-b288-d8f054a98111\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d448540bb1be65648908875fe4aa79dac7d1257ee7cb55a80599e0aa8c254f47/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.344427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.348417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.349512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.355560 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.372062 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4grqk\" (UniqueName: \"kubernetes.io/projected/b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06-kube-api-access-4grqk\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.404160 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ec175dde-29e2-445b-b288-d8f054a98111\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec175dde-29e2-445b-b288-d8f054a98111\") pod \"ovsdbserver-sb-0\" (UID: \"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443750 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcxqc\" (UniqueName: \"kubernetes.io/projected/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-kube-api-access-jcxqc\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 
18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443859 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e38b8410-9e8c-4240-a897-db813e0bda49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e38b8410-9e8c-4240-a897-db813e0bda49\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd401975-ad09-4e4d-a943-01aad83a3778\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd401975-ad09-4e4d-a943-01aad83a3778\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443908 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.443953 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 
19:32:07.443973 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxnp\" (UniqueName: \"kubernetes.io/projected/aa7136b8-263e-426e-9a0b-b9951c57dc16-kube-api-access-vrxnp\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444058 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444118 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444150 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-config\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-config\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444216 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.444703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.446964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-config\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.447257 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.447651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-config\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.448040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa7136b8-263e-426e-9a0b-b9951c57dc16-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.449316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.449621 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.450294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.451501 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.451538 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd401975-ad09-4e4d-a943-01aad83a3778\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd401975-ad09-4e4d-a943-01aad83a3778\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2ef1b7d6536d37c8ef06a6839e9933ce3a27cf3662572c87f1e203cc41134c94/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.453064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.453271 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.455245 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.455488 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e38b8410-9e8c-4240-a897-db813e0bda49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e38b8410-9e8c-4240-a897-db813e0bda49\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0b0092ccead61e1a0c1743ee3781a637f776ac2ada511f9c67869aea4bdcd8e/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.457719 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa7136b8-263e-426e-9a0b-b9951c57dc16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.457735 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.468042 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxnp\" (UniqueName: \"kubernetes.io/projected/aa7136b8-263e-426e-9a0b-b9951c57dc16-kube-api-access-vrxnp\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.470055 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcxqc\" (UniqueName: \"kubernetes.io/projected/b542ab9f-9954-4ce5-9b4a-e043befc3ffb-kube-api-access-jcxqc\") pod \"ovsdbserver-sb-1\" (UID: 
\"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.482216 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.497905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd401975-ad09-4e4d-a943-01aad83a3778\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd401975-ad09-4e4d-a943-01aad83a3778\") pod \"ovsdbserver-sb-1\" (UID: \"b542ab9f-9954-4ce5-9b4a-e043befc3ffb\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.499526 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e38b8410-9e8c-4240-a897-db813e0bda49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e38b8410-9e8c-4240-a897-db813e0bda49\") pod \"ovsdbserver-sb-2\" (UID: \"aa7136b8-263e-426e-9a0b-b9951c57dc16\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.506918 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 19:32:07 crc kubenswrapper[4830]: I0318 19:32:07.796752 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.064743 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 19:32:08 crc kubenswrapper[4830]: W0318 19:32:08.326180 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa7136b8_263e_426e_9a0b_b9951c57dc16.slice/crio-8b441c256415ab405521e4902e89e441c2061d1d309920b2c14be9dc76c6129f WatchSource:0}: Error finding container 8b441c256415ab405521e4902e89e441c2061d1d309920b2c14be9dc76c6129f: Status 404 returned error can't find the container with id 8b441c256415ab405521e4902e89e441c2061d1d309920b2c14be9dc76c6129f Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.330102 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.424249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4"} Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.452566 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b542ab9f-9954-4ce5-9b4a-e043befc3ffb","Type":"ContainerStarted","Data":"d4fe8777a5ca37cf2bfbd03cd5ed0882e5b924c8a0539570b7dc58dcd5948416"} Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.452607 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b542ab9f-9954-4ce5-9b4a-e043befc3ffb","Type":"ContainerStarted","Data":"bf71274fd6a44157293de7c23984d7e81d766ea4cec38ed45a6ad96b948367d4"} Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.480540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"aa7136b8-263e-426e-9a0b-b9951c57dc16","Type":"ContainerStarted","Data":"8b441c256415ab405521e4902e89e441c2061d1d309920b2c14be9dc76c6129f"} Mar 18 19:32:08 crc kubenswrapper[4830]: I0318 19:32:08.701369 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 19:32:08 crc kubenswrapper[4830]: W0318 19:32:08.709862 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e7c8a4_0bd6_4f1b_b681_3590db6d0d06.slice/crio-88c02bf012bea81011a9c4f3d300c1696a9f7447f65fa7243d2e4f2b833df84a WatchSource:0}: Error finding container 88c02bf012bea81011a9c4f3d300c1696a9f7447f65fa7243d2e4f2b833df84a: Status 404 returned error can't find the container with id 88c02bf012bea81011a9c4f3d300c1696a9f7447f65fa7243d2e4f2b833df84a Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.408792 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.410883 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.413552 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kqk2h" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.413557 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.413602 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.413907 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.433749 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.447046 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.448730 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.458333 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.460197 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.470358 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.480618 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488544 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-921c3430-7992-4843-8cfd-5c4f847f6796\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-921c3430-7992-4843-8cfd-5c4f847f6796\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488748 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488898 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488956 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxg9s\" (UniqueName: \"kubernetes.io/projected/3a144c27-96d7-47b3-abb0-ceefa288f311-kube-api-access-bxg9s\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.488992 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.489080 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.495170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b542ab9f-9954-4ce5-9b4a-e043befc3ffb","Type":"ContainerStarted","Data":"286fa8bcaa5bac433bba3bb5a3dfe36a569baf7639ffec85e27bdef57cdfd0d1"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.499913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06","Type":"ContainerStarted","Data":"82efd8660335d7d0e7e387e55ed9fa37b405975440d64f6e383a125ed2a5d73f"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.499952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06","Type":"ContainerStarted","Data":"8a6241e99d8a5be3eb667edb7e0c726d2d7eccd1f0088679365a167ee69551f6"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.499995 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06","Type":"ContainerStarted","Data":"88c02bf012bea81011a9c4f3d300c1696a9f7447f65fa7243d2e4f2b833df84a"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.502054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"aa7136b8-263e-426e-9a0b-b9951c57dc16","Type":"ContainerStarted","Data":"c03d0843da31cc6e65be7e1996e2d397ad150b4b838699e057f226f1ec8d836e"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.502097 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"aa7136b8-263e-426e-9a0b-b9951c57dc16","Type":"ContainerStarted","Data":"d312d671261f976bed667f9018338a4d77a6cfd25d8a0ab733f10c3a5fe6923e"} Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.530251 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.53022984 podStartE2EDuration="3.53022984s" podCreationTimestamp="2026-03-18 19:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:09.525159727 +0000 UTC m=+5364.092790089" watchObservedRunningTime="2026-03-18 19:32:09.53022984 +0000 UTC m=+5364.097860182" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.553579 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.553557968 podStartE2EDuration="3.553557968s" podCreationTimestamp="2026-03-18 19:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:09.550049759 +0000 UTC m=+5364.117680101" watchObservedRunningTime="2026-03-18 19:32:09.553557968 +0000 UTC m=+5364.121188310" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.586842 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.5868184850000002 podStartE2EDuration="3.586818485s" podCreationTimestamp="2026-03-18 19:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:09.574061416 +0000 UTC m=+5364.141691758" watchObservedRunningTime="2026-03-18 19:32:09.586818485 +0000 UTC m=+5364.154448817" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxg9s\" (UniqueName: \"kubernetes.io/projected/3a144c27-96d7-47b3-abb0-ceefa288f311-kube-api-access-bxg9s\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7bbtx\" (UniqueName: \"kubernetes.io/projected/c6fd0990-a53c-473e-9a5f-44721bfac06c-kube-api-access-7bbtx\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590640 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-config\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590741 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.590800 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591619 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591687 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591717 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591817 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-scripts\") pod \"ovsdbserver-nb-1\" (UID: 
\"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591938 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-921c3430-7992-4843-8cfd-5c4f847f6796\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-921c3430-7992-4843-8cfd-5c4f847f6796\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.591981 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592020 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592072 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc 
kubenswrapper[4830]: I0318 19:32:09.592107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkpfk\" (UniqueName: \"kubernetes.io/projected/7a66a884-3818-43a9-86e9-27d718abe5a6-kube-api-access-fkpfk\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5028261b-2fba-4f85-b33b-838a7a592436\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5028261b-2fba-4f85-b33b-838a7a592436\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.592300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 
19:32:09.592347 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.593531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.594124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-config\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.594235 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a144c27-96d7-47b3-abb0-ceefa288f311-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.596151 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.596189 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-921c3430-7992-4843-8cfd-5c4f847f6796\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-921c3430-7992-4843-8cfd-5c4f847f6796\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cdb84588964fd3d1c498cfb17e14451d16e5b09e5e2f5589b0b0d6221d507e9c/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.602738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.602996 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.604110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a144c27-96d7-47b3-abb0-ceefa288f311-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.609518 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxg9s\" (UniqueName: \"kubernetes.io/projected/3a144c27-96d7-47b3-abb0-ceefa288f311-kube-api-access-bxg9s\") pod \"ovsdbserver-nb-0\" (UID: 
\"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.636456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-921c3430-7992-4843-8cfd-5c4f847f6796\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-921c3430-7992-4843-8cfd-5c4f847f6796\") pod \"ovsdbserver-nb-0\" (UID: \"3a144c27-96d7-47b3-abb0-ceefa288f311\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.694882 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbtx\" (UniqueName: \"kubernetes.io/projected/c6fd0990-a53c-473e-9a5f-44721bfac06c-kube-api-access-7bbtx\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695150 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-config\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 
19:32:09.695237 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695299 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695343 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695543 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkpfk\" (UniqueName: \"kubernetes.io/projected/7a66a884-3818-43a9-86e9-27d718abe5a6-kube-api-access-fkpfk\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.695621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.696731 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-config\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.696748 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 
19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.697342 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.697398 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6fd0990-a53c-473e-9a5f-44721bfac06c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.697470 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5028261b-2fba-4f85-b33b-838a7a592436\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5028261b-2fba-4f85-b33b-838a7a592436\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.698012 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.698104 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.698451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7a66a884-3818-43a9-86e9-27d718abe5a6-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.699037 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.700417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.700507 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.702803 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.702871 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5716fcbb77d02d1f05a7bbafed4bac0a45e033aa1d0d61c7905e385c67afa5f9/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.703201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.703197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6fd0990-a53c-473e-9a5f-44721bfac06c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.703332 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.703594 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.703637 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5028261b-2fba-4f85-b33b-838a7a592436\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5028261b-2fba-4f85-b33b-838a7a592436\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32ef45665b5795acb7cbb2d46b9cf9e042111e770860a86ee6d51b50b69acb5f/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.704545 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a66a884-3818-43a9-86e9-27d718abe5a6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.713948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkpfk\" (UniqueName: \"kubernetes.io/projected/7a66a884-3818-43a9-86e9-27d718abe5a6-kube-api-access-fkpfk\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.725858 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbtx\" (UniqueName: \"kubernetes.io/projected/c6fd0990-a53c-473e-9a5f-44721bfac06c-kube-api-access-7bbtx\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.739033 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b0a1f6a-5ad9-495f-b738-a863e8e7af55\") pod \"ovsdbserver-nb-2\" (UID: \"c6fd0990-a53c-473e-9a5f-44721bfac06c\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.741674 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.742055 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5028261b-2fba-4f85-b33b-838a7a592436\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5028261b-2fba-4f85-b33b-838a7a592436\") pod \"ovsdbserver-nb-1\" (UID: \"7a66a884-3818-43a9-86e9-27d718abe5a6\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.775242 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:09 crc kubenswrapper[4830]: I0318 19:32:09.786571 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.146388 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 18 19:32:10 crc kubenswrapper[4830]: W0318 19:32:10.146615 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a66a884_3818_43a9_86e9_27d718abe5a6.slice/crio-39ba614070c37e0708bdc236c155b4b7a09b51cc2491eaac7306987629b47e73 WatchSource:0}: Error finding container 39ba614070c37e0708bdc236c155b4b7a09b51cc2491eaac7306987629b47e73: Status 404 returned error can't find the container with id 39ba614070c37e0708bdc236c155b4b7a09b51cc2491eaac7306987629b47e73
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.269252 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 18 19:32:10 crc kubenswrapper[4830]: W0318 19:32:10.274794 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6fd0990_a53c_473e_9a5f_44721bfac06c.slice/crio-9106e7e2a0b4cafa7119d680d0d62eebdb9bfdaec9cd08ba1f03040775585e71 WatchSource:0}: Error finding container 9106e7e2a0b4cafa7119d680d0d62eebdb9bfdaec9cd08ba1f03040775585e71: Status 404 returned error can't find the container with id 9106e7e2a0b4cafa7119d680d0d62eebdb9bfdaec9cd08ba1f03040775585e71
Mar 18 19:32:10 crc kubenswrapper[4830]: W0318 19:32:10.344515 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a144c27_96d7_47b3_abb0_ceefa288f311.slice/crio-e69bb499bd3562f15e45672089f7f324d47d373c57aeedcef2b1f0997848d2e2 WatchSource:0}: Error finding container e69bb499bd3562f15e45672089f7f324d47d373c57aeedcef2b1f0997848d2e2: Status 404 returned error can't find the container with id e69bb499bd3562f15e45672089f7f324d47d373c57aeedcef2b1f0997848d2e2
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.345671 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.482501 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.508193 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.511297 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a144c27-96d7-47b3-abb0-ceefa288f311","Type":"ContainerStarted","Data":"e69bb499bd3562f15e45672089f7f324d47d373c57aeedcef2b1f0997848d2e2"}
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.513090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"c6fd0990-a53c-473e-9a5f-44721bfac06c","Type":"ContainerStarted","Data":"9106e7e2a0b4cafa7119d680d0d62eebdb9bfdaec9cd08ba1f03040775585e71"}
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.517139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7a66a884-3818-43a9-86e9-27d718abe5a6","Type":"ContainerStarted","Data":"39ba614070c37e0708bdc236c155b4b7a09b51cc2491eaac7306987629b47e73"}
Mar 18 19:32:10 crc kubenswrapper[4830]: I0318 19:32:10.797289 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.532341 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7a66a884-3818-43a9-86e9-27d718abe5a6","Type":"ContainerStarted","Data":"74ce1116b2466e38e3a62e7e0b17f0ec5426e56ee1dadf46df31021bd8942f94"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.532427 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7a66a884-3818-43a9-86e9-27d718abe5a6","Type":"ContainerStarted","Data":"cca066accab5902a08e6f077fb365dcc09da83ce0a38d76d23b27d08a8a1fffd"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.536204 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a144c27-96d7-47b3-abb0-ceefa288f311","Type":"ContainerStarted","Data":"fc1e42c669daf36cebd88c40f124cf563b67f03c94a704b847fb31fc46d42d40"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.536506 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3a144c27-96d7-47b3-abb0-ceefa288f311","Type":"ContainerStarted","Data":"3843301c48b276a8b6fd24275b11f240c5f3bdcfdf7c552696f6ee5fa82a6c36"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.538365 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"c6fd0990-a53c-473e-9a5f-44721bfac06c","Type":"ContainerStarted","Data":"c2e264244f64d65bae6691e3f44c633457839e4656345d1ff453029ad7d24805"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.538443 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"c6fd0990-a53c-473e-9a5f-44721bfac06c","Type":"ContainerStarted","Data":"fd295ce02ff705fc329434341a86fdec630c92b399153fcde9713f9ac040832d"}
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.562103 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.562058422 podStartE2EDuration="3.562058422s" podCreationTimestamp="2026-03-18 19:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:11.560154468 +0000 UTC m=+5366.127784870" watchObservedRunningTime="2026-03-18 19:32:11.562058422 +0000 UTC m=+5366.129688764"
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.602277 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.602247775 podStartE2EDuration="3.602247775s" podCreationTimestamp="2026-03-18 19:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:11.590420631 +0000 UTC m=+5366.158051013" watchObservedRunningTime="2026-03-18 19:32:11.602247775 +0000 UTC m=+5366.169878147"
Mar 18 19:32:11 crc kubenswrapper[4830]: I0318 19:32:11.623342 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.623313568 podStartE2EDuration="3.623313568s" podCreationTimestamp="2026-03-18 19:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:11.621850327 +0000 UTC m=+5366.189480699" watchObservedRunningTime="2026-03-18 19:32:11.623313568 +0000 UTC m=+5366.190943940"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.482734 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.508217 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.742477 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.776005 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.787464 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:12 crc kubenswrapper[4830]: I0318 19:32:12.797690 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.557498 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.586747 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.642398 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.648344 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.872443 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.886618 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"]
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.888510 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.891534 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"]
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.892233 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.916811 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.991062 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.991122 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qtt\" (UniqueName: \"kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.991174 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:13 crc kubenswrapper[4830]: I0318 19:32:13.991373 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.092902 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.092988 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.093023 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qtt\" (UniqueName: \"kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.093059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.093951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.093972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.094198 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.110542 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qtt\" (UniqueName: \"kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt\") pod \"dnsmasq-dns-54c58fd5b7-jmrg4\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") " pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.211577 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.647961 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"]
Mar 18 19:32:14 crc kubenswrapper[4830]: W0318 19:32:14.656849 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d56c384_303f_47d9_acfc_bdde50843ff5.slice/crio-408279fd922bd5ff4b682322afc8783b04a4d3652cf97ba598a559eaa7300519 WatchSource:0}: Error finding container 408279fd922bd5ff4b682322afc8783b04a4d3652cf97ba598a559eaa7300519: Status 404 returned error can't find the container with id 408279fd922bd5ff4b682322afc8783b04a4d3652cf97ba598a559eaa7300519
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.742244 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.776046 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:14 crc kubenswrapper[4830]: I0318 19:32:14.786898 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.588017 4830 generic.go:334] "Generic (PLEG): container finished" podID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerID="f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c" exitCode=0
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.588148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" event={"ID":"0d56c384-303f-47d9-acfc-bdde50843ff5","Type":"ContainerDied","Data":"f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c"}
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.588492 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" event={"ID":"0d56c384-303f-47d9-acfc-bdde50843ff5","Type":"ContainerStarted","Data":"408279fd922bd5ff4b682322afc8783b04a4d3652cf97ba598a559eaa7300519"}
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.802684 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.860225 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.862169 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.871183 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.922747 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:32:15 crc kubenswrapper[4830]: I0318 19:32:15.932438 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.141423 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"]
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.173616 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9c58654c-md7lg"]
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.174840 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.177628 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.208145 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9c58654c-md7lg"]
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.241344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-config\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.241428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.241467 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-dns-svc\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.241559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.241599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrkx\" (UniqueName: \"kubernetes.io/projected/9db59c48-e641-4065-8d37-60d2aa70d67e-kube-api-access-jkrkx\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.342860 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-dns-svc\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.342944 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.342970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrkx\" (UniqueName: \"kubernetes.io/projected/9db59c48-e641-4065-8d37-60d2aa70d67e-kube-api-access-jkrkx\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.343014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-config\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.343057 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.344054 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.344078 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.344124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-dns-svc\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.344210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db59c48-e641-4065-8d37-60d2aa70d67e-config\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.367755 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrkx\" (UniqueName: \"kubernetes.io/projected/9db59c48-e641-4065-8d37-60d2aa70d67e-kube-api-access-jkrkx\") pod \"dnsmasq-dns-7b9c58654c-md7lg\" (UID: \"9db59c48-e641-4065-8d37-60d2aa70d67e\") " pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.490023 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.599340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" event={"ID":"0d56c384-303f-47d9-acfc-bdde50843ff5","Type":"ContainerStarted","Data":"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022"}
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.601325 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.633935 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" podStartSLOduration=3.63389357 podStartE2EDuration="3.63389357s" podCreationTimestamp="2026-03-18 19:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:16.631547914 +0000 UTC m=+5371.199178246" watchObservedRunningTime="2026-03-18 19:32:16.63389357 +0000 UTC m=+5371.201523912"
Mar 18 19:32:16 crc kubenswrapper[4830]: I0318 19:32:16.970029 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9c58654c-md7lg"]
Mar 18 19:32:16 crc kubenswrapper[4830]: W0318 19:32:16.973830 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9db59c48_e641_4065_8d37_60d2aa70d67e.slice/crio-2d14df195dab04d637a234e425f59d4cb37ca8190786ab20075b2c70d9e73fa0 WatchSource:0}: Error finding container 2d14df195dab04d637a234e425f59d4cb37ca8190786ab20075b2c70d9e73fa0: Status 404 returned error can't find the container with id 2d14df195dab04d637a234e425f59d4cb37ca8190786ab20075b2c70d9e73fa0
Mar 18 19:32:17 crc kubenswrapper[4830]: I0318 19:32:17.608269 4830 generic.go:334] "Generic (PLEG): container finished" podID="9db59c48-e641-4065-8d37-60d2aa70d67e" containerID="bcb787169a6e165dae3a59da725e71386f6ab401ca073675f13a24f908669f79" exitCode=0
Mar 18 19:32:17 crc kubenswrapper[4830]: I0318 19:32:17.608342 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" event={"ID":"9db59c48-e641-4065-8d37-60d2aa70d67e","Type":"ContainerDied","Data":"bcb787169a6e165dae3a59da725e71386f6ab401ca073675f13a24f908669f79"}
Mar 18 19:32:17 crc kubenswrapper[4830]: I0318 19:32:17.608864 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" event={"ID":"9db59c48-e641-4065-8d37-60d2aa70d67e","Type":"ContainerStarted","Data":"2d14df195dab04d637a234e425f59d4cb37ca8190786ab20075b2c70d9e73fa0"}
Mar 18 19:32:17 crc kubenswrapper[4830]: I0318 19:32:17.608998 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="dnsmasq-dns" containerID="cri-o://a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022" gracePeriod=10
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.034950 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4"
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.176585 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb\") pod \"0d56c384-303f-47d9-acfc-bdde50843ff5\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") "
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.176642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config\") pod \"0d56c384-303f-47d9-acfc-bdde50843ff5\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") "
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.176718 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5qtt\" (UniqueName: \"kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt\") pod \"0d56c384-303f-47d9-acfc-bdde50843ff5\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") "
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.176784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc\") pod \"0d56c384-303f-47d9-acfc-bdde50843ff5\" (UID: \"0d56c384-303f-47d9-acfc-bdde50843ff5\") "
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.181118 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt" (OuterVolumeSpecName: "kube-api-access-n5qtt") pod "0d56c384-303f-47d9-acfc-bdde50843ff5" (UID: "0d56c384-303f-47d9-acfc-bdde50843ff5"). InnerVolumeSpecName "kube-api-access-n5qtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.220979 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d56c384-303f-47d9-acfc-bdde50843ff5" (UID: "0d56c384-303f-47d9-acfc-bdde50843ff5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.230629 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config" (OuterVolumeSpecName: "config") pod "0d56c384-303f-47d9-acfc-bdde50843ff5" (UID: "0d56c384-303f-47d9-acfc-bdde50843ff5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.234477 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d56c384-303f-47d9-acfc-bdde50843ff5" (UID: "0d56c384-303f-47d9-acfc-bdde50843ff5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.279187 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.279422 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-config\") on node \"crc\" DevicePath \"\""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.279435 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5qtt\" (UniqueName: \"kubernetes.io/projected/0d56c384-303f-47d9-acfc-bdde50843ff5-kube-api-access-n5qtt\") on node \"crc\" DevicePath \"\""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.279448 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d56c384-303f-47d9-acfc-bdde50843ff5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.623192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" event={"ID":"9db59c48-e641-4065-8d37-60d2aa70d67e","Type":"ContainerStarted","Data":"ea13b2086997c97ffcd4c8308752a19194d9ce1f07fcbc1dd67aa48a25f1f5d4"}
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.625643 4830 generic.go:334] "Generic (PLEG): container finished" podID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerID="a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022" exitCode=0
Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.625696 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" event={"ID":"0d56c384-303f-47d9-acfc-bdde50843ff5","Type":"ContainerDied","Data":"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022"}
Mar
18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.625724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" event={"ID":"0d56c384-303f-47d9-acfc-bdde50843ff5","Type":"ContainerDied","Data":"408279fd922bd5ff4b682322afc8783b04a4d3652cf97ba598a559eaa7300519"} Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.625726 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c58fd5b7-jmrg4" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.625748 4830 scope.go:117] "RemoveContainer" containerID="a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.647947 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" podStartSLOduration=2.64791212 podStartE2EDuration="2.64791212s" podCreationTimestamp="2026-03-18 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:18.642842237 +0000 UTC m=+5373.210472569" watchObservedRunningTime="2026-03-18 19:32:18.64791212 +0000 UTC m=+5373.215542492" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.649626 4830 scope.go:117] "RemoveContainer" containerID="f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.669198 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"] Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.670056 4830 scope.go:117] "RemoveContainer" containerID="a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022" Mar 18 19:32:18 crc kubenswrapper[4830]: E0318 19:32:18.670504 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022\": container with ID starting with a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022 not found: ID does not exist" containerID="a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.670545 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022"} err="failed to get container status \"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022\": rpc error: code = NotFound desc = could not find container \"a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022\": container with ID starting with a41c96de12ff2be95c8b073ae4d317cd1b867bc174452081e2a4185504f6f022 not found: ID does not exist" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.670571 4830 scope.go:117] "RemoveContainer" containerID="f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c" Mar 18 19:32:18 crc kubenswrapper[4830]: E0318 19:32:18.671018 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c\": container with ID starting with f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c not found: ID does not exist" containerID="f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.671102 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c"} err="failed to get container status \"f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c\": rpc error: code = NotFound desc = could not find container \"f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c\": container with ID 
starting with f9c4cd7a6412cb509c74276897b4f3de0c0878473aed8cab2af0be1845e67f0c not found: ID does not exist" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.676709 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54c58fd5b7-jmrg4"] Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.894257 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:32:18 crc kubenswrapper[4830]: E0318 19:32:18.895325 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="init" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.895349 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="init" Mar 18 19:32:18 crc kubenswrapper[4830]: E0318 19:32:18.895408 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="dnsmasq-dns" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.895418 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="dnsmasq-dns" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.895786 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" containerName="dnsmasq-dns" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.896845 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.907009 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 18 19:32:18 crc kubenswrapper[4830]: I0318 19:32:18.920974 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.092602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d3970015-6cb7-4a6e-a33a-8701129b7335-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.092692 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622r2\" (UniqueName: \"kubernetes.io/projected/d3970015-6cb7-4a6e-a33a-8701129b7335-kube-api-access-622r2\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.092860 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.194473 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d3970015-6cb7-4a6e-a33a-8701129b7335-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.194528 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-622r2\" (UniqueName: \"kubernetes.io/projected/d3970015-6cb7-4a6e-a33a-8701129b7335-kube-api-access-622r2\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.194572 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.197745 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.197845 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f5670953e57d3d983373d18238800f6c8c0ce9223d850401e5942e29bc8ded2/globalmount\"" pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.212663 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d3970015-6cb7-4a6e-a33a-8701129b7335-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.217297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622r2\" (UniqueName: 
\"kubernetes.io/projected/d3970015-6cb7-4a6e-a33a-8701129b7335-kube-api-access-622r2\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.247424 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e48fda1-acbf-4e7c-ab63-0791570d63b5\") pod \"ovn-copy-data\" (UID: \"d3970015-6cb7-4a6e-a33a-8701129b7335\") " pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.406192 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.640086 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" Mar 18 19:32:19 crc kubenswrapper[4830]: I0318 19:32:19.927569 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:32:19 crc kubenswrapper[4830]: W0318 19:32:19.936021 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3970015_6cb7_4a6e_a33a_8701129b7335.slice/crio-9c2420c56052ffbec7c80d325e2022583c4242b487ebdc0668be24a24bf3164b WatchSource:0}: Error finding container 9c2420c56052ffbec7c80d325e2022583c4242b487ebdc0668be24a24bf3164b: Status 404 returned error can't find the container with id 9c2420c56052ffbec7c80d325e2022583c4242b487ebdc0668be24a24bf3164b Mar 18 19:32:20 crc kubenswrapper[4830]: I0318 19:32:20.250432 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d56c384-303f-47d9-acfc-bdde50843ff5" path="/var/lib/kubelet/pods/0d56c384-303f-47d9-acfc-bdde50843ff5/volumes" Mar 18 19:32:20 crc kubenswrapper[4830]: I0318 19:32:20.563101 4830 scope.go:117] "RemoveContainer" 
containerID="de68858792bec333aecb9155a45dab5286433cd2198d3c7e9d45ccb6a050742a" Mar 18 19:32:20 crc kubenswrapper[4830]: I0318 19:32:20.605902 4830 scope.go:117] "RemoveContainer" containerID="f7f83d76884de6e54ab330c49fc08f69f2b52725fcd7cac00f09aa160afd4f80" Mar 18 19:32:20 crc kubenswrapper[4830]: I0318 19:32:20.652458 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d3970015-6cb7-4a6e-a33a-8701129b7335","Type":"ContainerStarted","Data":"9c2420c56052ffbec7c80d325e2022583c4242b487ebdc0668be24a24bf3164b"} Mar 18 19:32:23 crc kubenswrapper[4830]: I0318 19:32:23.683301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d3970015-6cb7-4a6e-a33a-8701129b7335","Type":"ContainerStarted","Data":"54f0d7c9f1f1880aaed6a062352c16c2f6c8f8bab6909f920f32f644ea75e3e2"} Mar 18 19:32:23 crc kubenswrapper[4830]: I0318 19:32:23.706833 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.600342692 podStartE2EDuration="6.706803974s" podCreationTimestamp="2026-03-18 19:32:17 +0000 UTC" firstStartedPulling="2026-03-18 19:32:19.938861577 +0000 UTC m=+5374.506491919" lastFinishedPulling="2026-03-18 19:32:23.045322869 +0000 UTC m=+5377.612953201" observedRunningTime="2026-03-18 19:32:23.702936585 +0000 UTC m=+5378.270566957" watchObservedRunningTime="2026-03-18 19:32:23.706803974 +0000 UTC m=+5378.274434336" Mar 18 19:32:26 crc kubenswrapper[4830]: I0318 19:32:26.493047 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9c58654c-md7lg" Mar 18 19:32:26 crc kubenswrapper[4830]: I0318 19:32:26.578204 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"] Mar 18 19:32:26 crc kubenswrapper[4830]: I0318 19:32:26.578496 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" 
podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="dnsmasq-dns" containerID="cri-o://d60876bc67bb8b3d2d1c869fd3b1cc8a48a268e96e515e37bd5f92f9279e68ed" gracePeriod=10 Mar 18 19:32:26 crc kubenswrapper[4830]: I0318 19:32:26.716495 4830 generic.go:334] "Generic (PLEG): container finished" podID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerID="d60876bc67bb8b3d2d1c869fd3b1cc8a48a268e96e515e37bd5f92f9279e68ed" exitCode=0 Mar 18 19:32:26 crc kubenswrapper[4830]: I0318 19:32:26.716528 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerDied","Data":"d60876bc67bb8b3d2d1c869fd3b1cc8a48a268e96e515e37bd5f92f9279e68ed"} Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.059847 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.142949 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmfcr\" (UniqueName: \"kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr\") pod \"05dc5c44-41f3-44d6-ab05-9054e98e2523\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.143054 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config\") pod \"05dc5c44-41f3-44d6-ab05-9054e98e2523\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.143133 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc\") pod \"05dc5c44-41f3-44d6-ab05-9054e98e2523\" (UID: \"05dc5c44-41f3-44d6-ab05-9054e98e2523\") " Mar 18 19:32:27 crc 
kubenswrapper[4830]: I0318 19:32:27.156430 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr" (OuterVolumeSpecName: "kube-api-access-jmfcr") pod "05dc5c44-41f3-44d6-ab05-9054e98e2523" (UID: "05dc5c44-41f3-44d6-ab05-9054e98e2523"). InnerVolumeSpecName "kube-api-access-jmfcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.181047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05dc5c44-41f3-44d6-ab05-9054e98e2523" (UID: "05dc5c44-41f3-44d6-ab05-9054e98e2523"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.190344 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config" (OuterVolumeSpecName: "config") pod "05dc5c44-41f3-44d6-ab05-9054e98e2523" (UID: "05dc5c44-41f3-44d6-ab05-9054e98e2523"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.248326 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmfcr\" (UniqueName: \"kubernetes.io/projected/05dc5c44-41f3-44d6-ab05-9054e98e2523-kube-api-access-jmfcr\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.248374 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.248392 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05dc5c44-41f3-44d6-ab05-9054e98e2523-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.729371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" event={"ID":"05dc5c44-41f3-44d6-ab05-9054e98e2523","Type":"ContainerDied","Data":"ac53c57fe0a41b48573a8d4513e73c2ccfa99456d441423e52199a33183e2893"} Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.729617 4830 scope.go:117] "RemoveContainer" containerID="d60876bc67bb8b3d2d1c869fd3b1cc8a48a268e96e515e37bd5f92f9279e68ed" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.729911 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-grnt7" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.764605 4830 scope.go:117] "RemoveContainer" containerID="27abcad188fd5fd45b22ca44b46ac2c199a9392eae18ca84e606a09043a6dd09" Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.791313 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"] Mar 18 19:32:27 crc kubenswrapper[4830]: I0318 19:32:27.807376 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-grnt7"] Mar 18 19:32:28 crc kubenswrapper[4830]: I0318 19:32:28.252459 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" path="/var/lib/kubelet/pods/05dc5c44-41f3-44d6-ab05-9054e98e2523/volumes" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.460074 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:32:29 crc kubenswrapper[4830]: E0318 19:32:29.461313 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="dnsmasq-dns" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.461334 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="dnsmasq-dns" Mar 18 19:32:29 crc kubenswrapper[4830]: E0318 19:32:29.461389 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="init" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.461398 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="init" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.461838 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dc5c44-41f3-44d6-ab05-9054e98e2523" containerName="dnsmasq-dns" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.463147 
4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.466678 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.469187 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.470000 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nrbft" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.470951 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.495684 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591112 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-config\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591302 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-scripts\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " 
pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591452 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9vt\" (UniqueName: \"kubernetes.io/projected/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-kube-api-access-sh9vt\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.591475 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.692967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693034 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-config\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693056 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-scripts\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693076 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693125 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9vt\" (UniqueName: \"kubernetes.io/projected/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-kube-api-access-sh9vt\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.693451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.694193 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-scripts\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.694267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-config\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.701491 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.702446 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.702547 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.714997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9vt\" (UniqueName: \"kubernetes.io/projected/54f68d3c-1db5-4d9e-88e2-c970adb3a4fc-kube-api-access-sh9vt\") pod \"ovn-northd-0\" (UID: \"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc\") " pod="openstack/ovn-northd-0" Mar 18 19:32:29 crc kubenswrapper[4830]: I0318 19:32:29.792937 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.260548 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.768568 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc","Type":"ContainerStarted","Data":"29f665d3dfb147051a0f1b012d1dffcbe12533ee0dedf410d2bf28e5e10199b5"} Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.769201 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc","Type":"ContainerStarted","Data":"908113a80d8918747142b8205b727a77fde97bda8764df1392c61dc2d7b579ac"} Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.769730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"54f68d3c-1db5-4d9e-88e2-c970adb3a4fc","Type":"ContainerStarted","Data":"55f4cd3b4bde1d0625bc56b6b37705779a580b1c7777609e0c0c03afa5ff8947"} Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.769854 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 19:32:30 crc kubenswrapper[4830]: I0318 19:32:30.797308 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.797281923 podStartE2EDuration="1.797281923s" podCreationTimestamp="2026-03-18 19:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:32:30.793051334 +0000 UTC m=+5385.360681676" watchObservedRunningTime="2026-03-18 19:32:30.797281923 +0000 UTC m=+5385.364912295" Mar 18 19:32:49 crc kubenswrapper[4830]: I0318 19:32:49.896947 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.323404 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6r27l/must-gather-cgl7l"] Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.325072 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.331117 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6r27l"/"openshift-service-ca.crt" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.331194 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6r27l"/"default-dockercfg-rc5cd" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.331292 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6r27l"/"kube-root-ca.crt" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.338822 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6r27l/must-gather-cgl7l"] Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.470844 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.470903 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fft99\" (UniqueName: \"kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.572945 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.573320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fft99\" (UniqueName: \"kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.573502 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.600570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fft99\" (UniqueName: 
\"kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99\") pod \"must-gather-cgl7l\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:27 crc kubenswrapper[4830]: I0318 19:33:27.643152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:33:28 crc kubenswrapper[4830]: I0318 19:33:28.117184 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6r27l/must-gather-cgl7l"] Mar 18 19:33:28 crc kubenswrapper[4830]: I0318 19:33:28.344660 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/must-gather-cgl7l" event={"ID":"40d9417e-f228-46aa-a7c8-81f506659186","Type":"ContainerStarted","Data":"e810eed8fcd4ef62ef0efb87808acaccb36e488ef6c8535b1eca9f50a971b4de"} Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.416745 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/must-gather-cgl7l" event={"ID":"40d9417e-f228-46aa-a7c8-81f506659186","Type":"ContainerStarted","Data":"ae72e7eddd74bf29ab46beaf2231e7164d31d8a25fee373f104b0c715d88f1bd"} Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.417684 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/must-gather-cgl7l" event={"ID":"40d9417e-f228-46aa-a7c8-81f506659186","Type":"ContainerStarted","Data":"413131c532a8cd0d8ba5d1d51ae9f8e660c80d11698c75ef1501c86ba96e7ff5"} Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.445324 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6r27l/must-gather-cgl7l" podStartSLOduration=1.751061728 podStartE2EDuration="7.445293341s" podCreationTimestamp="2026-03-18 19:33:27 +0000 UTC" firstStartedPulling="2026-03-18 19:33:28.125475365 +0000 UTC m=+5442.693105697" lastFinishedPulling="2026-03-18 19:33:33.819706958 +0000 UTC 
m=+5448.387337310" observedRunningTime="2026-03-18 19:33:34.431303996 +0000 UTC m=+5448.998934368" watchObservedRunningTime="2026-03-18 19:33:34.445293341 +0000 UTC m=+5449.012923713" Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.726812 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6r27l/crc-debug-zq67s"] Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.727706 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.916571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host\") pod \"crc-debug-zq67s\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:34 crc kubenswrapper[4830]: I0318 19:33:34.917164 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n6gg\" (UniqueName: \"kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg\") pod \"crc-debug-zq67s\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.018626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n6gg\" (UniqueName: \"kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg\") pod \"crc-debug-zq67s\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.018806 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host\") pod \"crc-debug-zq67s\" (UID: 
\"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.018976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host\") pod \"crc-debug-zq67s\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.046121 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n6gg\" (UniqueName: \"kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg\") pod \"crc-debug-zq67s\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.048828 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:33:35 crc kubenswrapper[4830]: I0318 19:33:35.426844 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/crc-debug-zq67s" event={"ID":"8c830b09-8226-45f1-943f-6182986ec6bb","Type":"ContainerStarted","Data":"f1513045ed5d7862aa7c3540b70f3845d9becdc9b8019812c813193611ef8117"} Mar 18 19:33:46 crc kubenswrapper[4830]: I0318 19:33:46.519110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/crc-debug-zq67s" event={"ID":"8c830b09-8226-45f1-943f-6182986ec6bb","Type":"ContainerStarted","Data":"82b0782b4012662d0d892f8c021f7d47d87c798c6505077e79cadb9d21947411"} Mar 18 19:33:46 crc kubenswrapper[4830]: I0318 19:33:46.534843 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6r27l/crc-debug-zq67s" podStartSLOduration=2.393265729 podStartE2EDuration="12.534820488s" podCreationTimestamp="2026-03-18 19:33:34 +0000 UTC" 
firstStartedPulling="2026-03-18 19:33:35.110396258 +0000 UTC m=+5449.678026600" lastFinishedPulling="2026-03-18 19:33:45.251951027 +0000 UTC m=+5459.819581359" observedRunningTime="2026-03-18 19:33:46.531311499 +0000 UTC m=+5461.098941841" watchObservedRunningTime="2026-03-18 19:33:46.534820488 +0000 UTC m=+5461.102450840" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.148321 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564374-xb5r4"] Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.150352 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.153081 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.153446 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.153472 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.157302 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-xb5r4"] Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.225974 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlzpg\" (UniqueName: \"kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg\") pod \"auto-csr-approver-29564374-xb5r4\" (UID: \"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2\") " pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.327785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlzpg\" 
(UniqueName: \"kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg\") pod \"auto-csr-approver-29564374-xb5r4\" (UID: \"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2\") " pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.352220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlzpg\" (UniqueName: \"kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg\") pod \"auto-csr-approver-29564374-xb5r4\" (UID: \"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2\") " pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:00 crc kubenswrapper[4830]: I0318 19:34:00.475652 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:01 crc kubenswrapper[4830]: I0318 19:34:01.653700 4830 generic.go:334] "Generic (PLEG): container finished" podID="8c830b09-8226-45f1-943f-6182986ec6bb" containerID="82b0782b4012662d0d892f8c021f7d47d87c798c6505077e79cadb9d21947411" exitCode=0 Mar 18 19:34:01 crc kubenswrapper[4830]: I0318 19:34:01.653923 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/crc-debug-zq67s" event={"ID":"8c830b09-8226-45f1-943f-6182986ec6bb","Type":"ContainerDied","Data":"82b0782b4012662d0d892f8c021f7d47d87c798c6505077e79cadb9d21947411"} Mar 18 19:34:01 crc kubenswrapper[4830]: I0318 19:34:01.771311 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-xb5r4"] Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.663028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" event={"ID":"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2","Type":"ContainerStarted","Data":"8c95a48888a605e4e06f134060d79e71102c612cde3995c05d5f5820568258a9"} Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.800204 4830 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.832706 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6r27l/crc-debug-zq67s"] Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.839028 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6r27l/crc-debug-zq67s"] Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.870480 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n6gg\" (UniqueName: \"kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg\") pod \"8c830b09-8226-45f1-943f-6182986ec6bb\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.870640 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host\") pod \"8c830b09-8226-45f1-943f-6182986ec6bb\" (UID: \"8c830b09-8226-45f1-943f-6182986ec6bb\") " Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.870779 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host" (OuterVolumeSpecName: "host") pod "8c830b09-8226-45f1-943f-6182986ec6bb" (UID: "8c830b09-8226-45f1-943f-6182986ec6bb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.871002 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c830b09-8226-45f1-943f-6182986ec6bb-host\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.875837 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg" (OuterVolumeSpecName: "kube-api-access-5n6gg") pod "8c830b09-8226-45f1-943f-6182986ec6bb" (UID: "8c830b09-8226-45f1-943f-6182986ec6bb"). InnerVolumeSpecName "kube-api-access-5n6gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:34:02 crc kubenswrapper[4830]: I0318 19:34:02.972931 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n6gg\" (UniqueName: \"kubernetes.io/projected/8c830b09-8226-45f1-943f-6182986ec6bb-kube-api-access-5n6gg\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.097522 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6r27l/crc-debug-s2n7b"] Mar 18 19:34:04 crc kubenswrapper[4830]: E0318 19:34:04.098979 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c830b09-8226-45f1-943f-6182986ec6bb" containerName="container-00" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.099161 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c830b09-8226-45f1-943f-6182986ec6bb" containerName="container-00" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.099566 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c830b09-8226-45f1-943f-6182986ec6bb" containerName="container-00" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.100539 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.100847 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1513045ed5d7862aa7c3540b70f3845d9becdc9b8019812c813193611ef8117" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.100912 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-zq67s" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.128176 4830 generic.go:334] "Generic (PLEG): container finished" podID="c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" containerID="c2957f435f804f1afe69869cfd2f39a75df751ca9c6696ad1f1cbe7baab0573e" exitCode=0 Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.128209 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" event={"ID":"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2","Type":"ContainerDied","Data":"c2957f435f804f1afe69869cfd2f39a75df751ca9c6696ad1f1cbe7baab0573e"} Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.243235 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c830b09-8226-45f1-943f-6182986ec6bb" path="/var/lib/kubelet/pods/8c830b09-8226-45f1-943f-6182986ec6bb/volumes" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.281937 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcp4w\" (UniqueName: \"kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w\") pod \"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.282001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host\") pod 
\"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.384829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcp4w\" (UniqueName: \"kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w\") pod \"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.385204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host\") pod \"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.385718 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host\") pod \"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.422854 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcp4w\" (UniqueName: \"kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w\") pod \"crc-debug-s2n7b\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: I0318 19:34:04.436425 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:04 crc kubenswrapper[4830]: W0318 19:34:04.474950 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod029e0d78_d3b3_427d_bf94_8a1134040ce7.slice/crio-c4045188b139bb1882479837481b581b7a39a9aa5f4821cd7ee7281a768773cb WatchSource:0}: Error finding container c4045188b139bb1882479837481b581b7a39a9aa5f4821cd7ee7281a768773cb: Status 404 returned error can't find the container with id c4045188b139bb1882479837481b581b7a39a9aa5f4821cd7ee7281a768773cb Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.136988 4830 generic.go:334] "Generic (PLEG): container finished" podID="029e0d78-d3b3-427d-bf94-8a1134040ce7" containerID="6ad1c3bba87fcadc576c13864e5422d738a7b290264fe8b1b06d202384ee42b2" exitCode=1 Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.137091 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" event={"ID":"029e0d78-d3b3-427d-bf94-8a1134040ce7","Type":"ContainerDied","Data":"6ad1c3bba87fcadc576c13864e5422d738a7b290264fe8b1b06d202384ee42b2"} Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.137288 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" event={"ID":"029e0d78-d3b3-427d-bf94-8a1134040ce7","Type":"ContainerStarted","Data":"c4045188b139bb1882479837481b581b7a39a9aa5f4821cd7ee7281a768773cb"} Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.179788 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6r27l/crc-debug-s2n7b"] Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.185548 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6r27l/crc-debug-s2n7b"] Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.448014 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.601807 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlzpg\" (UniqueName: \"kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg\") pod \"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2\" (UID: \"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2\") " Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.607009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg" (OuterVolumeSpecName: "kube-api-access-tlzpg") pod "c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" (UID: "c289a25d-46e5-48a1-8e1a-ec0415ee0bd2"). InnerVolumeSpecName "kube-api-access-tlzpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:34:05 crc kubenswrapper[4830]: I0318 19:34:05.703737 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlzpg\" (UniqueName: \"kubernetes.io/projected/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2-kube-api-access-tlzpg\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.147035 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" event={"ID":"c289a25d-46e5-48a1-8e1a-ec0415ee0bd2","Type":"ContainerDied","Data":"8c95a48888a605e4e06f134060d79e71102c612cde3995c05d5f5820568258a9"} Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.147069 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c95a48888a605e4e06f134060d79e71102c612cde3995c05d5f5820568258a9" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.147088 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-xb5r4" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.258232 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.413745 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcp4w\" (UniqueName: \"kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w\") pod \"029e0d78-d3b3-427d-bf94-8a1134040ce7\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.413924 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host\") pod \"029e0d78-d3b3-427d-bf94-8a1134040ce7\" (UID: \"029e0d78-d3b3-427d-bf94-8a1134040ce7\") " Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.414115 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host" (OuterVolumeSpecName: "host") pod "029e0d78-d3b3-427d-bf94-8a1134040ce7" (UID: "029e0d78-d3b3-427d-bf94-8a1134040ce7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.415287 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/029e0d78-d3b3-427d-bf94-8a1134040ce7-host\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.427519 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w" (OuterVolumeSpecName: "kube-api-access-tcp4w") pod "029e0d78-d3b3-427d-bf94-8a1134040ce7" (UID: "029e0d78-d3b3-427d-bf94-8a1134040ce7"). 
InnerVolumeSpecName "kube-api-access-tcp4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.516843 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcp4w\" (UniqueName: \"kubernetes.io/projected/029e0d78-d3b3-427d-bf94-8a1134040ce7-kube-api-access-tcp4w\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.516944 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-6rcr9"] Mar 18 19:34:06 crc kubenswrapper[4830]: I0318 19:34:06.523729 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-6rcr9"] Mar 18 19:34:07 crc kubenswrapper[4830]: I0318 19:34:07.159533 4830 scope.go:117] "RemoveContainer" containerID="6ad1c3bba87fcadc576c13864e5422d738a7b290264fe8b1b06d202384ee42b2" Mar 18 19:34:07 crc kubenswrapper[4830]: I0318 19:34:07.159567 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6r27l/crc-debug-s2n7b" Mar 18 19:34:08 crc kubenswrapper[4830]: I0318 19:34:08.244425 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029e0d78-d3b3-427d-bf94-8a1134040ce7" path="/var/lib/kubelet/pods/029e0d78-d3b3-427d-bf94-8a1134040ce7/volumes" Mar 18 19:34:08 crc kubenswrapper[4830]: I0318 19:34:08.245166 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e61852-2fc0-4654-9fe3-084767781d4d" path="/var/lib/kubelet/pods/b6e61852-2fc0-4654-9fe3-084767781d4d/volumes" Mar 18 19:34:20 crc kubenswrapper[4830]: I0318 19:34:20.727513 4830 scope.go:117] "RemoveContainer" containerID="4ac8ccfb14a6a8f02d4a7e58c9ffed928a41635000ad34fc38fad25125e240ba" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.455297 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7b9c58654c-md7lg_9db59c48-e641-4065-8d37-60d2aa70d67e/init/0.log" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.606101 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7b9c58654c-md7lg_9db59c48-e641-4065-8d37-60d2aa70d67e/init/0.log" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.638000 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7b9c58654c-md7lg_9db59c48-e641-4065-8d37-60d2aa70d67e/dnsmasq-dns/0.log" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.772801 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_11c7a099-6459-475e-9c09-d2acfacec884/adoption/0.log" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.885536 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ae6d3440-d0f8-4bf4-a1cd-0a9cf63bc92d/memcached/0.log" Mar 18 19:34:27 crc kubenswrapper[4830]: I0318 19:34:27.968989 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_06e64bc2-e23b-4b88-8e5e-87d979fd10f3/mysql-bootstrap/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.153589 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06e64bc2-e23b-4b88-8e5e-87d979fd10f3/mysql-bootstrap/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.156848 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06e64bc2-e23b-4b88-8e5e-87d979fd10f3/galera/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.205068 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b9f6fa82-cef7-4a49-a6ac-053e904d5142/mysql-bootstrap/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.336200 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b9f6fa82-cef7-4a49-a6ac-053e904d5142/mysql-bootstrap/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.374227 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b9f6fa82-cef7-4a49-a6ac-053e904d5142/galera/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.397800 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_d3970015-6cb7-4a6e-a33a-8701129b7335/adoption/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.550159 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_54f68d3c-1db5-4d9e-88e2-c970adb3a4fc/openstack-network-exporter/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.569699 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_54f68d3c-1db5-4d9e-88e2-c970adb3a4fc/ovn-northd/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.723921 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_3a144c27-96d7-47b3-abb0-ceefa288f311/openstack-network-exporter/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.731099 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3a144c27-96d7-47b3-abb0-ceefa288f311/ovsdbserver-nb/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.797472 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7a66a884-3818-43a9-86e9-27d718abe5a6/openstack-network-exporter/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.912340 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7a66a884-3818-43a9-86e9-27d718abe5a6/ovsdbserver-nb/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.952242 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_c6fd0990-a53c-473e-9a5f-44721bfac06c/openstack-network-exporter/0.log" Mar 18 19:34:28 crc kubenswrapper[4830]: I0318 19:34:28.955283 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_c6fd0990-a53c-473e-9a5f-44721bfac06c/ovsdbserver-nb/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.120401 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06/ovsdbserver-sb/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.134571 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b9e7c8a4-0bd6-4f1b-b681-3590db6d0d06/openstack-network-exporter/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.271697 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_b542ab9f-9954-4ce5-9b4a-e043befc3ffb/openstack-network-exporter/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.273439 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_b542ab9f-9954-4ce5-9b4a-e043befc3ffb/ovsdbserver-sb/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.315192 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_aa7136b8-263e-426e-9a0b-b9951c57dc16/openstack-network-exporter/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.448969 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_aa7136b8-263e-426e-9a0b-b9951c57dc16/ovsdbserver-sb/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.492840 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_26291628-d7b3-47e1-a7a4-81c569506ff8/setup-container/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.509615 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.509680 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.668984 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_26291628-d7b3-47e1-a7a4-81c569506ff8/rabbitmq/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.675484 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_26291628-d7b3-47e1-a7a4-81c569506ff8/setup-container/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 
19:34:29.715969 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cd410e3-2ad4-4616-86ca-a5423a651ab7/setup-container/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.854355 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-xf6rc_f6cac547-ed4f-439e-80d4-2deb7c49dec7/mariadb-account-create-update/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.859951 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cd410e3-2ad4-4616-86ca-a5423a651ab7/rabbitmq/0.log" Mar 18 19:34:29 crc kubenswrapper[4830]: I0318 19:34:29.889469 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cd410e3-2ad4-4616-86ca-a5423a651ab7/setup-container/0.log" Mar 18 19:34:45 crc kubenswrapper[4830]: I0318 19:34:45.815945 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/util/0.log" Mar 18 19:34:45 crc kubenswrapper[4830]: I0318 19:34:45.981864 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/util/0.log" Mar 18 19:34:45 crc kubenswrapper[4830]: I0318 19:34:45.995188 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/pull/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.022547 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/pull/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.171530 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/util/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.199022 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/extract/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.201043 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cwmhdk_469b4b97-e3cf-43f8-b161-3dfe6489da28/pull/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.414061 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-n478g_09f3d007-f621-4c30-a3f8-f3280a7db75d/manager/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.595728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-bw766_cebd0fbd-7733-464a-aead-539d69b70b04/manager/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.753757 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-cspdp_bd94092b-4a34-4f83-9aaa-5ddac374e97a/manager/0.log" Mar 18 19:34:46 crc kubenswrapper[4830]: I0318 19:34:46.848254 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-84z2j_b1bf7404-a81d-42e8-bc1b-157c5cd791b0/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.025675 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-b2f94_45854217-6284-4678-903f-d64b4088ec29/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.308966 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-nvkbv_abc98dd9-9c79-45a2-a641-023633c1b75b/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.466246 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-mg24p_3d9eef66-a93f-432a-8391-f6a55dc3f800/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.604666 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dls7x_7e1c4c11-ddb2-45a3-94eb-8b5b27866996/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.675722 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-cpvkm_db7f527f-6421-4a26-9eae-5a68054b2a88/manager/0.log" Mar 18 19:34:47 crc kubenswrapper[4830]: I0318 19:34:47.932292 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-vjzn6_6a1a3dc1-1535-4091-97a8-abc6dc2d1388/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.025966 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-6jmp6_2f1b63b3-9d24-4f33-8d39-7decb4a7e0a8/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.086743 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-4tflr_ff6a5b70-c9ae-4087-b4fd-e24a712e6e33/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.204660 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-4mzzb_1adc20ef-dc7a-4dce-ba20-6fe6eb6146f8/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 
19:34:48.244441 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6282s_28e6b5ce-47e2-43fb-b524-c2e642dbd166/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.443992 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74c4796899qtfsv_7a872110-8984-419e-b5ed-177ec5669cfc/manager/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.634677 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b85c4d696-j9fcx_2897dfaf-4627-4986-8920-e6c789387c3c/operator/0.log" Mar 18 19:34:48 crc kubenswrapper[4830]: I0318 19:34:48.919262 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f9lwr_2f7546fc-e4cb-438e-8091-74ed74bee260/registry-server/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.108005 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-d76n9_e7de1579-8ed8-4434-818e-a5ea0c366cf7/manager/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.145408 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-ct2qr_ed9931be-8631-4e73-92bd-ff18076dcb69/manager/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.383567 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-85j54_e663b49d-cd0f-4a18-8284-b12dad6c136a/operator/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.496403 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86bd8996f6-x2hs7_8c1cd3c4-f399-4810-bdaf-53644d7555ff/manager/0.log" Mar 18 19:34:49 crc 
kubenswrapper[4830]: I0318 19:34:49.514344 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-thwsc_d60eef08-1564-405f-b4c1-3f391bbf741d/manager/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.663129 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-pfj9c_debd9df4-01da-4f5e-b66c-d1f7bd08574f/manager/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.683136 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-x9zq5_6fd482c2-5871-4d59-8402-1e57b06055b0/manager/0.log" Mar 18 19:34:49 crc kubenswrapper[4830]: I0318 19:34:49.702265 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-r9bbq_f68f6008-a3fb-4039-85a0-c0475455ac09/manager/0.log" Mar 18 19:34:59 crc kubenswrapper[4830]: I0318 19:34:59.510356 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:34:59 crc kubenswrapper[4830]: I0318 19:34:59.511041 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:35:08 crc kubenswrapper[4830]: I0318 19:35:08.497825 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-g5rg5_57c79b01-642f-4c45-886c-b3e852c0bc23/control-plane-machine-set-operator/0.log" Mar 18 19:35:08 crc kubenswrapper[4830]: I0318 19:35:08.618324 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f77k6_9279fbd5-1378-4a9a-b35d-85a7b9430674/kube-rbac-proxy/0.log" Mar 18 19:35:08 crc kubenswrapper[4830]: I0318 19:35:08.671594 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f77k6_9279fbd5-1378-4a9a-b35d-85a7b9430674/machine-api-operator/0.log" Mar 18 19:35:22 crc kubenswrapper[4830]: I0318 19:35:22.088455 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-6d85b_c3a619f8-3b2a-4852-996d-eb0fb7c0ae8e/cert-manager-controller/0.log" Mar 18 19:35:22 crc kubenswrapper[4830]: I0318 19:35:22.234460 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-k5pz4_4668ca7c-f03c-4236-83f8-17c1c156c754/cert-manager-webhook/0.log" Mar 18 19:35:22 crc kubenswrapper[4830]: I0318 19:35:22.256026 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-l4ncg_286e3d3b-4f86-45b5-819c-1e6c107eb985/cert-manager-cainjector/0.log" Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.509673 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.511408 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.511567 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.512388 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.512539 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4" gracePeriod=600 Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.888283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4"} Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.888712 4830 scope.go:117] "RemoveContainer" containerID="2b2583ffa620998bcee9d8c36a0271eaafa77acd768af552c84519fa8e9cd8a5" Mar 18 19:35:29 crc kubenswrapper[4830]: I0318 19:35:29.888879 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4" exitCode=0 Mar 18 19:35:29 crc 
kubenswrapper[4830]: I0318 19:35:29.889489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerStarted","Data":"12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"} Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.138614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-fhhnd_0d49e8e6-69ee-4c37-a687-433ad140281d/nmstate-console-plugin/0.log" Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.249867 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qdbvx_f480edf1-a3da-4567-bf97-d3067dd88a64/nmstate-handler/0.log" Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.285217 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-lfvdq_fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8/kube-rbac-proxy/0.log" Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.349971 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-lfvdq_fb2bc12d-5865-47fb-bb2e-85e8c0dad6c8/nmstate-metrics/0.log" Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.470688 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-9mfrv_50496be8-7302-44e4-87ac-b976ebb2099e/nmstate-operator/0.log" Mar 18 19:35:35 crc kubenswrapper[4830]: I0318 19:35:35.529616 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-xww5t_aca99d47-cac9-4c1d-97da-5ad69260de41/nmstate-webhook/0.log" Mar 18 19:35:54 crc kubenswrapper[4830]: I0318 19:35:54.078757 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xf6rc"] Mar 18 19:35:54 crc kubenswrapper[4830]: I0318 19:35:54.092030 4830 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xf6rc"] Mar 18 19:35:54 crc kubenswrapper[4830]: I0318 19:35:54.247743 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cac547-ed4f-439e-80d4-2deb7c49dec7" path="/var/lib/kubelet/pods/f6cac547-ed4f-439e-80d4-2deb7c49dec7/volumes" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.155219 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564376-vzw8s"] Mar 18 19:36:00 crc kubenswrapper[4830]: E0318 19:36:00.156114 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029e0d78-d3b3-427d-bf94-8a1134040ce7" containerName="container-00" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.156137 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="029e0d78-d3b3-427d-bf94-8a1134040ce7" containerName="container-00" Mar 18 19:36:00 crc kubenswrapper[4830]: E0318 19:36:00.156155 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" containerName="oc" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.156168 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" containerName="oc" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.156462 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" containerName="oc" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.156485 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="029e0d78-d3b3-427d-bf94-8a1134040ce7" containerName="container-00" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.157427 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.159160 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.159530 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.159638 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.165598 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564376-vzw8s"] Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.244127 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5sv\" (UniqueName: \"kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv\") pod \"auto-csr-approver-29564376-vzw8s\" (UID: \"35f27262-49ea-414d-8bc2-d77e9a8c102b\") " pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.346611 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5sv\" (UniqueName: \"kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv\") pod \"auto-csr-approver-29564376-vzw8s\" (UID: \"35f27262-49ea-414d-8bc2-d77e9a8c102b\") " pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.368450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5sv\" (UniqueName: \"kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv\") pod \"auto-csr-approver-29564376-vzw8s\" (UID: \"35f27262-49ea-414d-8bc2-d77e9a8c102b\") " 
pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.495432 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:00 crc kubenswrapper[4830]: I0318 19:36:00.948285 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564376-vzw8s"] Mar 18 19:36:00 crc kubenswrapper[4830]: W0318 19:36:00.949997 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f27262_49ea_414d_8bc2_d77e9a8c102b.slice/crio-750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758 WatchSource:0}: Error finding container 750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758: Status 404 returned error can't find the container with id 750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758 Mar 18 19:36:01 crc kubenswrapper[4830]: I0318 19:36:01.184216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" event={"ID":"35f27262-49ea-414d-8bc2-d77e9a8c102b","Type":"ContainerStarted","Data":"750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758"} Mar 18 19:36:03 crc kubenswrapper[4830]: I0318 19:36:03.202485 4830 generic.go:334] "Generic (PLEG): container finished" podID="35f27262-49ea-414d-8bc2-d77e9a8c102b" containerID="9f5af2dec175593426328231acc5447cf7333c3fb0c3b454a4cfc3d471cccdbb" exitCode=0 Mar 18 19:36:03 crc kubenswrapper[4830]: I0318 19:36:03.202546 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" event={"ID":"35f27262-49ea-414d-8bc2-d77e9a8c102b","Type":"ContainerDied","Data":"9f5af2dec175593426328231acc5447cf7333c3fb0c3b454a4cfc3d471cccdbb"} Mar 18 19:36:03 crc kubenswrapper[4830]: I0318 19:36:03.838532 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xcvlh_73874e54-172a-4960-8e37-e495e16e4ff7/kube-rbac-proxy/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.023755 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-frr-files/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.172070 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xcvlh_73874e54-172a-4960-8e37-e495e16e4ff7/controller/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.217366 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-frr-files/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.237626 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-reloader/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.241830 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-metrics/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.376549 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-reloader/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.499228 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.560428 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-metrics/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.568421 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-metrics/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.569710 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-frr-files/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.589317 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-reloader/0.log" Mar 18 19:36:04 crc kubenswrapper[4830]: I0318 19:36:04.614030 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5sv\" (UniqueName: \"kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv\") pod \"35f27262-49ea-414d-8bc2-d77e9a8c102b\" (UID: \"35f27262-49ea-414d-8bc2-d77e9a8c102b\") " Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.782833 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-metrics/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.788005 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-reloader/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.792083 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/cp-frr-files/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.825240 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/controller/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.956348 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/kube-rbac-proxy/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.972564 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/frr-metrics/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:04.983229 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/kube-rbac-proxy-frr/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.000990 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv" (OuterVolumeSpecName: "kube-api-access-mc5sv") pod "35f27262-49ea-414d-8bc2-d77e9a8c102b" (UID: "35f27262-49ea-414d-8bc2-d77e9a8c102b"). InnerVolumeSpecName "kube-api-access-mc5sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.019177 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5sv\" (UniqueName: \"kubernetes.io/projected/35f27262-49ea-414d-8bc2-d77e9a8c102b-kube-api-access-mc5sv\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.148591 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/reloader/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.168585 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-kjllh_ac9878d6-cca1-49b1-bca8-3ad035256043/frr-k8s-webhook-server/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.221068 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" event={"ID":"35f27262-49ea-414d-8bc2-d77e9a8c102b","Type":"ContainerDied","Data":"750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758"} Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.222081 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750ed938cb222916d76ef5216e1860d035d4b292a4acd182405e78f272f31758" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.221111 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-vzw8s" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.395466 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86dfc68bcc-zxmhg_684ba3af-f549-47d2-81b9-52a1993a93ff/manager/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.576847 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-57ddf"] Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.578993 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-57ddf"] Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.590453 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-85f6f6858-qtgzb_1ca766c2-0f41-45d7-b219-d5293e66ca65/webhook-server/0.log" Mar 18 19:36:05 crc kubenswrapper[4830]: I0318 19:36:05.613688 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7csnn_c470043c-dedf-46ee-a690-ccc828a69f63/kube-rbac-proxy/0.log" Mar 18 19:36:06 crc kubenswrapper[4830]: I0318 19:36:06.205503 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7csnn_c470043c-dedf-46ee-a690-ccc828a69f63/speaker/0.log" Mar 18 19:36:06 crc kubenswrapper[4830]: I0318 19:36:06.248523 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b24fea-e1aa-4450-89d7-6f932c84f2c7" path="/var/lib/kubelet/pods/70b24fea-e1aa-4450-89d7-6f932c84f2c7/volumes" Mar 18 19:36:06 crc kubenswrapper[4830]: I0318 19:36:06.939627 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhmfc_0f86ec1c-6e52-4bd0-af13-dbb311f12c6b/frr/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.409204 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/util/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.506729 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/util/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.551117 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/pull/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.583108 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/pull/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.699871 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/util/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.733079 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/pull/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.741905 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kpvxj_6700b709-fb58-41aa-a8e9-aad61b389860/extract/0.log" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.848535 4830 scope.go:117] "RemoveContainer" containerID="745b75b99bb44c6f5cc8cbffd58e409153b4b2829442ea7e5bbbd3bcfddb1377" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.867855 4830 scope.go:117] 
"RemoveContainer" containerID="7c0e8892ac206e736e07af575208adb9e3727da708072c019517ec285b107e9a" Mar 18 19:36:20 crc kubenswrapper[4830]: I0318 19:36:20.868600 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.015192 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.034838 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.053798 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.204887 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.205513 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/extract/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.226032 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1qs86c_3ced8bbc-00e4-4b23-88dc-809962a78be2/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.374824 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.549450 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.552856 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.568982 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.705378 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/util/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.723390 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/extract/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.738116 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xggsq_0b224be1-4685-41ee-b31e-8dfbcb80968d/pull/0.log" Mar 18 19:36:21 crc kubenswrapper[4830]: I0318 19:36:21.878999 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.005446 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.008322 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-content/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.025879 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-content/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.162175 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-content/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.168557 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.349267 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.617451 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.619557 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-content/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.659141 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-content/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.831107 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-utilities/0.log" Mar 18 19:36:22 crc kubenswrapper[4830]: I0318 19:36:22.882332 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/extract-content/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.011701 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-88skg_f7518e45-59c3-47b1-bd28-fc7f74a2dfaa/registry-server/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.122741 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xqf2s_77c8fe94-c2c8-419b-a4c0-a1bc5f4d011f/marketplace-operator/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.305589 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-utilities/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.451470 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-utilities/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.538638 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-content/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.557809 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-content/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.637049 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wnjfd_68e35d1e-15cb-4293-aa87-cb04b6dc1a72/registry-server/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.659781 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-utilities/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.707559 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/extract-content/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.845839 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-utilities/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.894582 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7w7m4_c9ff1801-8502-4f37-aa28-55a689cbb2c1/registry-server/0.log" Mar 18 19:36:23 crc kubenswrapper[4830]: I0318 19:36:23.997638 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-utilities/0.log" Mar 18 19:36:24 crc kubenswrapper[4830]: I0318 19:36:24.021348 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-content/0.log" 
Mar 18 19:36:24 crc kubenswrapper[4830]: I0318 19:36:24.026635 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-content/0.log" Mar 18 19:36:24 crc kubenswrapper[4830]: I0318 19:36:24.150764 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-utilities/0.log" Mar 18 19:36:24 crc kubenswrapper[4830]: I0318 19:36:24.166039 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/extract-content/0.log" Mar 18 19:36:24 crc kubenswrapper[4830]: I0318 19:36:24.880934 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6pt2b_f1147eda-0b31-4c1e-9923-f8d73c80f9a6/registry-server/0.log" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.597814 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:27 crc kubenswrapper[4830]: E0318 19:36:27.598463 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f27262-49ea-414d-8bc2-d77e9a8c102b" containerName="oc" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.598476 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f27262-49ea-414d-8bc2-d77e9a8c102b" containerName="oc" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.598691 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f27262-49ea-414d-8bc2-d77e9a8c102b" containerName="oc" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.600055 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.614473 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.693359 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgdw\" (UniqueName: \"kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.693611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.693671 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.794999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgdw\" (UniqueName: \"kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.795051 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.795099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.795690 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.796251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.825201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgdw\" (UniqueName: \"kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw\") pod \"redhat-operators-nmpk7\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:27 crc kubenswrapper[4830]: I0318 19:36:27.922998 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:28 crc kubenswrapper[4830]: I0318 19:36:28.362057 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:28 crc kubenswrapper[4830]: I0318 19:36:28.419349 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerStarted","Data":"8d689ff3559094e4bff15d62e62f30ff97935367148f55f40a7ff7bd6cec9aed"} Mar 18 19:36:29 crc kubenswrapper[4830]: I0318 19:36:29.427834 4830 generic.go:334] "Generic (PLEG): container finished" podID="0f9676bf-2353-462e-baf2-308a57abaef3" containerID="8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43" exitCode=0 Mar 18 19:36:29 crc kubenswrapper[4830]: I0318 19:36:29.427893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerDied","Data":"8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43"} Mar 18 19:36:29 crc kubenswrapper[4830]: I0318 19:36:29.992732 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:29 crc kubenswrapper[4830]: I0318 19:36:29.994320 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.007058 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.030036 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf95m\" (UniqueName: \"kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.030107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.030486 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.132481 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.132551 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jf95m\" (UniqueName: \"kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.132583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.133044 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.133073 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.151875 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf95m\" (UniqueName: \"kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m\") pod \"community-operators-lqh98\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.193646 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.195174 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.204466 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.234578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.234649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.234681 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c4vv\" (UniqueName: \"kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.335991 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.336400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.336444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.336466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c4vv\" (UniqueName: \"kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.336944 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.336961 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " 
pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.353749 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c4vv\" (UniqueName: \"kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv\") pod \"certified-operators-khsbj\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.436612 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerStarted","Data":"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03"} Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.514500 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.827288 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:30 crc kubenswrapper[4830]: W0318 19:36:30.845518 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133e1556_2942_4826_a1ca_8b3232c5332f.slice/crio-02b2fed93a1951b9ea05019ec1da01905a6e243e12271494fc0ad3ddbf2f1a26 WatchSource:0}: Error finding container 02b2fed93a1951b9ea05019ec1da01905a6e243e12271494fc0ad3ddbf2f1a26: Status 404 returned error can't find the container with id 02b2fed93a1951b9ea05019ec1da01905a6e243e12271494fc0ad3ddbf2f1a26 Mar 18 19:36:30 crc kubenswrapper[4830]: I0318 19:36:30.864365 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.447943 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="0f9676bf-2353-462e-baf2-308a57abaef3" containerID="ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03" exitCode=0 Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.448002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerDied","Data":"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03"} Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.450284 4830 generic.go:334] "Generic (PLEG): container finished" podID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerID="81f2fe40705375851246c69029bc41d646ee3bf97924a1616cd92c57a1ac50f1" exitCode=0 Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.450905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerDied","Data":"81f2fe40705375851246c69029bc41d646ee3bf97924a1616cd92c57a1ac50f1"} Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.450950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerStarted","Data":"34b2237e8d881cce27f33f05f1083d1135c160ba33da87e0fbb78bb153562f11"} Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.454324 4830 generic.go:334] "Generic (PLEG): container finished" podID="133e1556-2942-4826-a1ca-8b3232c5332f" containerID="9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1" exitCode=0 Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.454386 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerDied","Data":"9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1"} Mar 18 19:36:31 crc kubenswrapper[4830]: I0318 19:36:31.454435 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerStarted","Data":"02b2fed93a1951b9ea05019ec1da01905a6e243e12271494fc0ad3ddbf2f1a26"} Mar 18 19:36:32 crc kubenswrapper[4830]: I0318 19:36:32.463538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerStarted","Data":"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb"} Mar 18 19:36:32 crc kubenswrapper[4830]: I0318 19:36:32.467463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerStarted","Data":"83149655ce1ee74a4f259fa13ddeb2308064c6d63fc1ca4ade56829782438c64"} Mar 18 19:36:32 crc kubenswrapper[4830]: I0318 19:36:32.485876 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nmpk7" podStartSLOduration=3.052840543 podStartE2EDuration="5.485859682s" podCreationTimestamp="2026-03-18 19:36:27 +0000 UTC" firstStartedPulling="2026-03-18 19:36:29.429979136 +0000 UTC m=+5623.997609478" lastFinishedPulling="2026-03-18 19:36:31.862998285 +0000 UTC m=+5626.430628617" observedRunningTime="2026-03-18 19:36:32.4790539 +0000 UTC m=+5627.046684232" watchObservedRunningTime="2026-03-18 19:36:32.485859682 +0000 UTC m=+5627.053490014" Mar 18 19:36:33 crc kubenswrapper[4830]: I0318 19:36:33.480072 4830 generic.go:334] "Generic (PLEG): container finished" podID="133e1556-2942-4826-a1ca-8b3232c5332f" containerID="04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d" exitCode=0 Mar 18 19:36:33 crc kubenswrapper[4830]: I0318 19:36:33.480216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" 
event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerDied","Data":"04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d"} Mar 18 19:36:33 crc kubenswrapper[4830]: I0318 19:36:33.483048 4830 generic.go:334] "Generic (PLEG): container finished" podID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerID="83149655ce1ee74a4f259fa13ddeb2308064c6d63fc1ca4ade56829782438c64" exitCode=0 Mar 18 19:36:33 crc kubenswrapper[4830]: I0318 19:36:33.483422 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerDied","Data":"83149655ce1ee74a4f259fa13ddeb2308064c6d63fc1ca4ade56829782438c64"} Mar 18 19:36:34 crc kubenswrapper[4830]: I0318 19:36:34.496171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerStarted","Data":"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6"} Mar 18 19:36:34 crc kubenswrapper[4830]: I0318 19:36:34.498592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerStarted","Data":"03ef2e33fc3a814a1a09793f67c275946fa178b015ccb4587e256a5b89da9984"} Mar 18 19:36:34 crc kubenswrapper[4830]: I0318 19:36:34.520110 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-khsbj" podStartSLOduration=2.065148403 podStartE2EDuration="4.52009374s" podCreationTimestamp="2026-03-18 19:36:30 +0000 UTC" firstStartedPulling="2026-03-18 19:36:31.457228768 +0000 UTC m=+5626.024859090" lastFinishedPulling="2026-03-18 19:36:33.912174085 +0000 UTC m=+5628.479804427" observedRunningTime="2026-03-18 19:36:34.516014755 +0000 UTC m=+5629.083645087" watchObservedRunningTime="2026-03-18 19:36:34.52009374 +0000 UTC 
m=+5629.087724062" Mar 18 19:36:34 crc kubenswrapper[4830]: I0318 19:36:34.539860 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lqh98" podStartSLOduration=3.052068845 podStartE2EDuration="5.539840057s" podCreationTimestamp="2026-03-18 19:36:29 +0000 UTC" firstStartedPulling="2026-03-18 19:36:31.452524155 +0000 UTC m=+5626.020154487" lastFinishedPulling="2026-03-18 19:36:33.940295377 +0000 UTC m=+5628.507925699" observedRunningTime="2026-03-18 19:36:34.53214068 +0000 UTC m=+5629.099771012" watchObservedRunningTime="2026-03-18 19:36:34.539840057 +0000 UTC m=+5629.107470379" Mar 18 19:36:37 crc kubenswrapper[4830]: I0318 19:36:37.923643 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:37 crc kubenswrapper[4830]: I0318 19:36:37.924137 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:38 crc kubenswrapper[4830]: I0318 19:36:38.973350 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nmpk7" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="registry-server" probeResult="failure" output=< Mar 18 19:36:38 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 18 19:36:38 crc kubenswrapper[4830]: > Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.336335 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.336411 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.401091 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.515537 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.515580 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.567349 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.613449 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:40 crc kubenswrapper[4830]: I0318 19:36:40.628019 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:41 crc kubenswrapper[4830]: I0318 19:36:41.987215 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:42 crc kubenswrapper[4830]: I0318 19:36:42.554590 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-khsbj" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="registry-server" containerID="cri-o://79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6" gracePeriod=2 Mar 18 19:36:42 crc kubenswrapper[4830]: I0318 19:36:42.998029 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:42 crc kubenswrapper[4830]: I0318 19:36:42.998320 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lqh98" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="registry-server" 
containerID="cri-o://03ef2e33fc3a814a1a09793f67c275946fa178b015ccb4587e256a5b89da9984" gracePeriod=2 Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.168176 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.256085 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c4vv\" (UniqueName: \"kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv\") pod \"133e1556-2942-4826-a1ca-8b3232c5332f\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.256166 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content\") pod \"133e1556-2942-4826-a1ca-8b3232c5332f\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.256208 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities\") pod \"133e1556-2942-4826-a1ca-8b3232c5332f\" (UID: \"133e1556-2942-4826-a1ca-8b3232c5332f\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.257260 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities" (OuterVolumeSpecName: "utilities") pod "133e1556-2942-4826-a1ca-8b3232c5332f" (UID: "133e1556-2942-4826-a1ca-8b3232c5332f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.261363 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv" (OuterVolumeSpecName: "kube-api-access-4c4vv") pod "133e1556-2942-4826-a1ca-8b3232c5332f" (UID: "133e1556-2942-4826-a1ca-8b3232c5332f"). InnerVolumeSpecName "kube-api-access-4c4vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.358150 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c4vv\" (UniqueName: \"kubernetes.io/projected/133e1556-2942-4826-a1ca-8b3232c5332f-kube-api-access-4c4vv\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.358187 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.532070 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "133e1556-2942-4826-a1ca-8b3232c5332f" (UID: "133e1556-2942-4826-a1ca-8b3232c5332f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.564836 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/133e1556-2942-4826-a1ca-8b3232c5332f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.570126 4830 generic.go:334] "Generic (PLEG): container finished" podID="133e1556-2942-4826-a1ca-8b3232c5332f" containerID="79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6" exitCode=0 Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.570242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerDied","Data":"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6"} Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.570283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khsbj" event={"ID":"133e1556-2942-4826-a1ca-8b3232c5332f","Type":"ContainerDied","Data":"02b2fed93a1951b9ea05019ec1da01905a6e243e12271494fc0ad3ddbf2f1a26"} Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.577061 4830 scope.go:117] "RemoveContainer" containerID="79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.577388 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khsbj" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.583313 4830 generic.go:334] "Generic (PLEG): container finished" podID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerID="03ef2e33fc3a814a1a09793f67c275946fa178b015ccb4587e256a5b89da9984" exitCode=0 Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.583361 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerDied","Data":"03ef2e33fc3a814a1a09793f67c275946fa178b015ccb4587e256a5b89da9984"} Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.617464 4830 scope.go:117] "RemoveContainer" containerID="04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.622535 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.629699 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-khsbj"] Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.640311 4830 scope.go:117] "RemoveContainer" containerID="9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.658183 4830 scope.go:117] "RemoveContainer" containerID="79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6" Mar 18 19:36:43 crc kubenswrapper[4830]: E0318 19:36:43.658536 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6\": container with ID starting with 79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6 not found: ID does not exist" 
containerID="79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.658647 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6"} err="failed to get container status \"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6\": rpc error: code = NotFound desc = could not find container \"79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6\": container with ID starting with 79167c8c85d3991cc962816df40fad2baafad1a88d326e56d73f5ebfd05e2ac6 not found: ID does not exist" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.658670 4830 scope.go:117] "RemoveContainer" containerID="04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d" Mar 18 19:36:43 crc kubenswrapper[4830]: E0318 19:36:43.658899 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d\": container with ID starting with 04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d not found: ID does not exist" containerID="04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.658923 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d"} err="failed to get container status \"04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d\": rpc error: code = NotFound desc = could not find container \"04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d\": container with ID starting with 04294fbc2d213c8bc5766bf3bbd08bc75a42240431e87541af7dd6c4d52cad1d not found: ID does not exist" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.658942 4830 scope.go:117] 
"RemoveContainer" containerID="9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1" Mar 18 19:36:43 crc kubenswrapper[4830]: E0318 19:36:43.659146 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1\": container with ID starting with 9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1 not found: ID does not exist" containerID="9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.659167 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1"} err="failed to get container status \"9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1\": rpc error: code = NotFound desc = could not find container \"9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1\": container with ID starting with 9f6e9b69d19d94b22148a0f573893ba7df60bb0135a124083ca2e9a58e6c2ad1 not found: ID does not exist" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.936677 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.970991 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content\") pod \"35ce8420-4a0e-4217-8685-54ac853d4e4e\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.971094 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities\") pod \"35ce8420-4a0e-4217-8685-54ac853d4e4e\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.971203 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf95m\" (UniqueName: \"kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m\") pod \"35ce8420-4a0e-4217-8685-54ac853d4e4e\" (UID: \"35ce8420-4a0e-4217-8685-54ac853d4e4e\") " Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.972065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities" (OuterVolumeSpecName: "utilities") pod "35ce8420-4a0e-4217-8685-54ac853d4e4e" (UID: "35ce8420-4a0e-4217-8685-54ac853d4e4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:43 crc kubenswrapper[4830]: I0318 19:36:43.985487 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m" (OuterVolumeSpecName: "kube-api-access-jf95m") pod "35ce8420-4a0e-4217-8685-54ac853d4e4e" (UID: "35ce8420-4a0e-4217-8685-54ac853d4e4e"). InnerVolumeSpecName "kube-api-access-jf95m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.031180 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35ce8420-4a0e-4217-8685-54ac853d4e4e" (UID: "35ce8420-4a0e-4217-8685-54ac853d4e4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.072869 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf95m\" (UniqueName: \"kubernetes.io/projected/35ce8420-4a0e-4217-8685-54ac853d4e4e-kube-api-access-jf95m\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.072901 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.072910 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ce8420-4a0e-4217-8685-54ac853d4e4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.247906 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" path="/var/lib/kubelet/pods/133e1556-2942-4826-a1ca-8b3232c5332f/volumes" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.595290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqh98" event={"ID":"35ce8420-4a0e-4217-8685-54ac853d4e4e","Type":"ContainerDied","Data":"34b2237e8d881cce27f33f05f1083d1135c160ba33da87e0fbb78bb153562f11"} Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.595354 4830 scope.go:117] "RemoveContainer" 
containerID="03ef2e33fc3a814a1a09793f67c275946fa178b015ccb4587e256a5b89da9984" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.595382 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqh98" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.624866 4830 scope.go:117] "RemoveContainer" containerID="83149655ce1ee74a4f259fa13ddeb2308064c6d63fc1ca4ade56829782438c64" Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.630308 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.637919 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lqh98"] Mar 18 19:36:44 crc kubenswrapper[4830]: I0318 19:36:44.649553 4830 scope.go:117] "RemoveContainer" containerID="81f2fe40705375851246c69029bc41d646ee3bf97924a1616cd92c57a1ac50f1" Mar 18 19:36:46 crc kubenswrapper[4830]: I0318 19:36:46.251880 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" path="/var/lib/kubelet/pods/35ce8420-4a0e-4217-8685-54ac853d4e4e/volumes" Mar 18 19:36:47 crc kubenswrapper[4830]: I0318 19:36:47.997004 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:48 crc kubenswrapper[4830]: I0318 19:36:48.094998 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:48 crc kubenswrapper[4830]: I0318 19:36:48.795664 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:49 crc kubenswrapper[4830]: I0318 19:36:49.640743 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nmpk7" 
podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="registry-server" containerID="cri-o://7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb" gracePeriod=2 Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.126653 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.179032 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbgdw\" (UniqueName: \"kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw\") pod \"0f9676bf-2353-462e-baf2-308a57abaef3\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.179155 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities\") pod \"0f9676bf-2353-462e-baf2-308a57abaef3\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.179246 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content\") pod \"0f9676bf-2353-462e-baf2-308a57abaef3\" (UID: \"0f9676bf-2353-462e-baf2-308a57abaef3\") " Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.180469 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities" (OuterVolumeSpecName: "utilities") pod "0f9676bf-2353-462e-baf2-308a57abaef3" (UID: "0f9676bf-2353-462e-baf2-308a57abaef3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.189195 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw" (OuterVolumeSpecName: "kube-api-access-fbgdw") pod "0f9676bf-2353-462e-baf2-308a57abaef3" (UID: "0f9676bf-2353-462e-baf2-308a57abaef3"). InnerVolumeSpecName "kube-api-access-fbgdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.281072 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.281108 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbgdw\" (UniqueName: \"kubernetes.io/projected/0f9676bf-2353-462e-baf2-308a57abaef3-kube-api-access-fbgdw\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.312369 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f9676bf-2353-462e-baf2-308a57abaef3" (UID: "0f9676bf-2353-462e-baf2-308a57abaef3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.382724 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9676bf-2353-462e-baf2-308a57abaef3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.649407 4830 generic.go:334] "Generic (PLEG): container finished" podID="0f9676bf-2353-462e-baf2-308a57abaef3" containerID="7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb" exitCode=0 Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.649452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerDied","Data":"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb"} Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.649499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmpk7" event={"ID":"0f9676bf-2353-462e-baf2-308a57abaef3","Type":"ContainerDied","Data":"8d689ff3559094e4bff15d62e62f30ff97935367148f55f40a7ff7bd6cec9aed"} Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.649518 4830 scope.go:117] "RemoveContainer" containerID="7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.649895 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmpk7" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.667129 4830 scope.go:117] "RemoveContainer" containerID="ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.683658 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.686904 4830 scope.go:117] "RemoveContainer" containerID="8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.707928 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nmpk7"] Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.750166 4830 scope.go:117] "RemoveContainer" containerID="7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb" Mar 18 19:36:50 crc kubenswrapper[4830]: E0318 19:36:50.750589 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb\": container with ID starting with 7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb not found: ID does not exist" containerID="7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.750637 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb"} err="failed to get container status \"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb\": rpc error: code = NotFound desc = could not find container \"7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb\": container with ID starting with 7b3599fbc3b2447eb10a49d6564616187c098716fc8946ee6033ef71857eb0bb not found: ID does 
not exist" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.750666 4830 scope.go:117] "RemoveContainer" containerID="ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03" Mar 18 19:36:50 crc kubenswrapper[4830]: E0318 19:36:50.750916 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03\": container with ID starting with ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03 not found: ID does not exist" containerID="ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.750939 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03"} err="failed to get container status \"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03\": rpc error: code = NotFound desc = could not find container \"ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03\": container with ID starting with ed7b6fac970d2240f07dbb27c3e8c0915f7a38a027b4d7ed1fe1624fc4f9dd03 not found: ID does not exist" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.750952 4830 scope.go:117] "RemoveContainer" containerID="8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43" Mar 18 19:36:50 crc kubenswrapper[4830]: E0318 19:36:50.751597 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43\": container with ID starting with 8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43 not found: ID does not exist" containerID="8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43" Mar 18 19:36:50 crc kubenswrapper[4830]: I0318 19:36:50.751648 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43"} err="failed to get container status \"8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43\": rpc error: code = NotFound desc = could not find container \"8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43\": container with ID starting with 8fe7467fc8ec36a438d92c7bcce68299cf5f40eb83f8878081c95d522e509a43 not found: ID does not exist" Mar 18 19:36:52 crc kubenswrapper[4830]: I0318 19:36:52.243383 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" path="/var/lib/kubelet/pods/0f9676bf-2353-462e-baf2-308a57abaef3/volumes" Mar 18 19:37:29 crc kubenswrapper[4830]: I0318 19:37:29.510094 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:37:29 crc kubenswrapper[4830]: I0318 19:37:29.510568 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:37:57 crc kubenswrapper[4830]: I0318 19:37:57.289530 4830 generic.go:334] "Generic (PLEG): container finished" podID="40d9417e-f228-46aa-a7c8-81f506659186" containerID="413131c532a8cd0d8ba5d1d51ae9f8e660c80d11698c75ef1501c86ba96e7ff5" exitCode=0 Mar 18 19:37:57 crc kubenswrapper[4830]: I0318 19:37:57.289676 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6r27l/must-gather-cgl7l" 
event={"ID":"40d9417e-f228-46aa-a7c8-81f506659186","Type":"ContainerDied","Data":"413131c532a8cd0d8ba5d1d51ae9f8e660c80d11698c75ef1501c86ba96e7ff5"} Mar 18 19:37:57 crc kubenswrapper[4830]: I0318 19:37:57.290763 4830 scope.go:117] "RemoveContainer" containerID="413131c532a8cd0d8ba5d1d51ae9f8e660c80d11698c75ef1501c86ba96e7ff5" Mar 18 19:37:58 crc kubenswrapper[4830]: I0318 19:37:58.054672 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6r27l_must-gather-cgl7l_40d9417e-f228-46aa-a7c8-81f506659186/gather/0.log" Mar 18 19:37:59 crc kubenswrapper[4830]: I0318 19:37:59.512078 4830 patch_prober.go:28] interesting pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:37:59 crc kubenswrapper[4830]: I0318 19:37:59.515445 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.163110 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564378-ngb88"] Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.164172 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.164407 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.164616 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.164823 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.165016 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.165203 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.165381 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.165545 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.165754 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167007 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.167042 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167065 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="extract-content" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.167084 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167093 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.167110 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167118 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: E0318 19:38:00.167138 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167146 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="extract-utilities" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167359 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="133e1556-2942-4826-a1ca-8b3232c5332f" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167386 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9676bf-2353-462e-baf2-308a57abaef3" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.167400 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ce8420-4a0e-4217-8685-54ac853d4e4e" containerName="registry-server" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.168068 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.170338 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.170459 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.170612 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.176492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564378-ngb88"] Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.191239 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lzw\" (UniqueName: \"kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw\") pod \"auto-csr-approver-29564378-ngb88\" (UID: \"f8d4d80e-c7ab-403b-89c7-28f957f2ec29\") " pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.292906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lzw\" (UniqueName: \"kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw\") pod \"auto-csr-approver-29564378-ngb88\" (UID: \"f8d4d80e-c7ab-403b-89c7-28f957f2ec29\") " pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.316886 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lzw\" (UniqueName: \"kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw\") pod \"auto-csr-approver-29564378-ngb88\" (UID: \"f8d4d80e-c7ab-403b-89c7-28f957f2ec29\") " 
pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.507344 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.779729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564378-ngb88"] Mar 18 19:38:00 crc kubenswrapper[4830]: W0318 19:38:00.785951 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d4d80e_c7ab_403b_89c7_28f957f2ec29.slice/crio-c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae WatchSource:0}: Error finding container c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae: Status 404 returned error can't find the container with id c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae Mar 18 19:38:00 crc kubenswrapper[4830]: I0318 19:38:00.789914 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:38:01 crc kubenswrapper[4830]: I0318 19:38:01.331256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-ngb88" event={"ID":"f8d4d80e-c7ab-403b-89c7-28f957f2ec29","Type":"ContainerStarted","Data":"c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae"} Mar 18 19:38:03 crc kubenswrapper[4830]: I0318 19:38:03.355505 4830 generic.go:334] "Generic (PLEG): container finished" podID="f8d4d80e-c7ab-403b-89c7-28f957f2ec29" containerID="8b5d91b95c87f46a35eaee1c1cac9f0da027ee30d6ad58e617e35c949c39b12f" exitCode=0 Mar 18 19:38:03 crc kubenswrapper[4830]: I0318 19:38:03.355568 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-ngb88" 
event={"ID":"f8d4d80e-c7ab-403b-89c7-28f957f2ec29","Type":"ContainerDied","Data":"8b5d91b95c87f46a35eaee1c1cac9f0da027ee30d6ad58e617e35c949c39b12f"} Mar 18 19:38:04 crc kubenswrapper[4830]: I0318 19:38:04.701489 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:04 crc kubenswrapper[4830]: I0318 19:38:04.881828 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lzw\" (UniqueName: \"kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw\") pod \"f8d4d80e-c7ab-403b-89c7-28f957f2ec29\" (UID: \"f8d4d80e-c7ab-403b-89c7-28f957f2ec29\") " Mar 18 19:38:04 crc kubenswrapper[4830]: I0318 19:38:04.887718 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw" (OuterVolumeSpecName: "kube-api-access-72lzw") pod "f8d4d80e-c7ab-403b-89c7-28f957f2ec29" (UID: "f8d4d80e-c7ab-403b-89c7-28f957f2ec29"). InnerVolumeSpecName "kube-api-access-72lzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:38:04 crc kubenswrapper[4830]: I0318 19:38:04.983810 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72lzw\" (UniqueName: \"kubernetes.io/projected/f8d4d80e-c7ab-403b-89c7-28f957f2ec29-kube-api-access-72lzw\") on node \"crc\" DevicePath \"\"" Mar 18 19:38:05 crc kubenswrapper[4830]: I0318 19:38:05.378083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-ngb88" event={"ID":"f8d4d80e-c7ab-403b-89c7-28f957f2ec29","Type":"ContainerDied","Data":"c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae"} Mar 18 19:38:05 crc kubenswrapper[4830]: I0318 19:38:05.378128 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e8b22fe6cc15a29ca8d626499271c076ad474e374cdac8a8c13e60b5a238ae" Mar 18 19:38:05 crc kubenswrapper[4830]: I0318 19:38:05.378142 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-ngb88" Mar 18 19:38:05 crc kubenswrapper[4830]: I0318 19:38:05.796929 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-f5wwf"] Mar 18 19:38:05 crc kubenswrapper[4830]: I0318 19:38:05.805255 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-f5wwf"] Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.188411 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6r27l/must-gather-cgl7l"] Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.188722 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6r27l/must-gather-cgl7l" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="copy" containerID="cri-o://ae72e7eddd74bf29ab46beaf2231e7164d31d8a25fee373f104b0c715d88f1bd" gracePeriod=2 Mar 18 19:38:06 crc kubenswrapper[4830]: 
I0318 19:38:06.197756 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6r27l/must-gather-cgl7l"] Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.268541 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31812e16-dacb-42b9-90ca-91a269bc452e" path="/var/lib/kubelet/pods/31812e16-dacb-42b9-90ca-91a269bc452e/volumes" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.402833 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6r27l_must-gather-cgl7l_40d9417e-f228-46aa-a7c8-81f506659186/copy/0.log" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.407382 4830 generic.go:334] "Generic (PLEG): container finished" podID="40d9417e-f228-46aa-a7c8-81f506659186" containerID="ae72e7eddd74bf29ab46beaf2231e7164d31d8a25fee373f104b0c715d88f1bd" exitCode=143 Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.658268 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6r27l_must-gather-cgl7l_40d9417e-f228-46aa-a7c8-81f506659186/copy/0.log" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.658976 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.771376 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fft99\" (UniqueName: \"kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99\") pod \"40d9417e-f228-46aa-a7c8-81f506659186\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.771467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output\") pod \"40d9417e-f228-46aa-a7c8-81f506659186\" (UID: \"40d9417e-f228-46aa-a7c8-81f506659186\") " Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.792934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99" (OuterVolumeSpecName: "kube-api-access-fft99") pod "40d9417e-f228-46aa-a7c8-81f506659186" (UID: "40d9417e-f228-46aa-a7c8-81f506659186"). InnerVolumeSpecName "kube-api-access-fft99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.873781 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fft99\" (UniqueName: \"kubernetes.io/projected/40d9417e-f228-46aa-a7c8-81f506659186-kube-api-access-fft99\") on node \"crc\" DevicePath \"\"" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.894332 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "40d9417e-f228-46aa-a7c8-81f506659186" (UID: "40d9417e-f228-46aa-a7c8-81f506659186"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:38:06 crc kubenswrapper[4830]: I0318 19:38:06.975642 4830 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40d9417e-f228-46aa-a7c8-81f506659186-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 19:38:07 crc kubenswrapper[4830]: I0318 19:38:07.417340 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6r27l_must-gather-cgl7l_40d9417e-f228-46aa-a7c8-81f506659186/copy/0.log" Mar 18 19:38:07 crc kubenswrapper[4830]: I0318 19:38:07.417888 4830 scope.go:117] "RemoveContainer" containerID="ae72e7eddd74bf29ab46beaf2231e7164d31d8a25fee373f104b0c715d88f1bd" Mar 18 19:38:07 crc kubenswrapper[4830]: I0318 19:38:07.417950 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6r27l/must-gather-cgl7l" Mar 18 19:38:07 crc kubenswrapper[4830]: I0318 19:38:07.435110 4830 scope.go:117] "RemoveContainer" containerID="413131c532a8cd0d8ba5d1d51ae9f8e660c80d11698c75ef1501c86ba96e7ff5" Mar 18 19:38:08 crc kubenswrapper[4830]: I0318 19:38:08.254051 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d9417e-f228-46aa-a7c8-81f506659186" path="/var/lib/kubelet/pods/40d9417e-f228-46aa-a7c8-81f506659186/volumes" Mar 18 19:38:21 crc kubenswrapper[4830]: I0318 19:38:21.040881 4830 scope.go:117] "RemoveContainer" containerID="5a6fb9d5be538d1cba3e482b20abe4416f054634107597fd5210bf1c45d1c1f5" Mar 18 19:38:21 crc kubenswrapper[4830]: I0318 19:38:21.071743 4830 scope.go:117] "RemoveContainer" containerID="fe3295288a11cae05a63cabc68d21d080181f6943b89ea74288fd70d8f0385f5" Mar 18 19:38:21 crc kubenswrapper[4830]: I0318 19:38:21.119407 4830 scope.go:117] "RemoveContainer" containerID="8619914e4a023da11a3e9be1f7c2e238278ad285ee79f0240a5698a24e4b92cd" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.509990 4830 patch_prober.go:28] interesting 
pod/machine-config-daemon-plzpb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.510900 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.510975 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.512084 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"} pod="openshift-machine-config-operator/machine-config-daemon-plzpb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.512173 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerName="machine-config-daemon" containerID="cri-o://12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" gracePeriod=600 Mar 18 19:38:29 crc kubenswrapper[4830]: E0318 19:38:29.643594 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.650540 4830 generic.go:334] "Generic (PLEG): container finished" podID="fbe02a32-24dc-4772-8a10-0128d3a304e4" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" exitCode=0 Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.650611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" event={"ID":"fbe02a32-24dc-4772-8a10-0128d3a304e4","Type":"ContainerDied","Data":"12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"} Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.650658 4830 scope.go:117] "RemoveContainer" containerID="359983fc39dc77da53ab9c5f404699ef39069a5a5ae55ff906f4e3793c0766a4" Mar 18 19:38:29 crc kubenswrapper[4830]: I0318 19:38:29.651572 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:38:29 crc kubenswrapper[4830]: E0318 19:38:29.652092 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:38:42 crc kubenswrapper[4830]: I0318 19:38:42.234911 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:38:42 crc kubenswrapper[4830]: E0318 19:38:42.236493 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:38:54 crc kubenswrapper[4830]: I0318 19:38:54.235304 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:38:54 crc kubenswrapper[4830]: E0318 19:38:54.236337 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:39:09 crc kubenswrapper[4830]: I0318 19:39:09.235429 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:39:09 crc kubenswrapper[4830]: E0318 19:39:09.236757 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:39:23 crc kubenswrapper[4830]: I0318 19:39:23.234964 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:39:23 crc kubenswrapper[4830]: E0318 19:39:23.235953 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:39:35 crc kubenswrapper[4830]: I0318 19:39:35.234822 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:39:35 crc kubenswrapper[4830]: E0318 19:39:35.235978 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:39:49 crc kubenswrapper[4830]: I0318 19:39:49.234820 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:39:49 crc kubenswrapper[4830]: E0318 19:39:49.235966 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.158168 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564380-wh9jw"] Mar 18 19:40:00 crc kubenswrapper[4830]: E0318 19:40:00.159405 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="gather" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159422 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="gather" Mar 18 19:40:00 crc kubenswrapper[4830]: E0318 19:40:00.159434 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="copy" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159442 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="copy" Mar 18 19:40:00 crc kubenswrapper[4830]: E0318 19:40:00.159456 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d4d80e-c7ab-403b-89c7-28f957f2ec29" containerName="oc" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159465 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d4d80e-c7ab-403b-89c7-28f957f2ec29" containerName="oc" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159693 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="gather" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159707 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d9417e-f228-46aa-a7c8-81f506659186" containerName="copy" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.159720 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d4d80e-c7ab-403b-89c7-28f957f2ec29" containerName="oc" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.160336 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564380-wh9jw" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.167220 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qhdgs" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.167628 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.167958 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.173391 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564380-wh9jw"] Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.319273 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgw46\" (UniqueName: \"kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46\") pod \"auto-csr-approver-29564380-wh9jw\" (UID: \"8d4f49d6-7305-42d0-a1bf-90034da5fa04\") " pod="openshift-infra/auto-csr-approver-29564380-wh9jw" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.421279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgw46\" (UniqueName: \"kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46\") pod \"auto-csr-approver-29564380-wh9jw\" (UID: \"8d4f49d6-7305-42d0-a1bf-90034da5fa04\") " pod="openshift-infra/auto-csr-approver-29564380-wh9jw" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.455903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgw46\" (UniqueName: \"kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46\") pod \"auto-csr-approver-29564380-wh9jw\" (UID: \"8d4f49d6-7305-42d0-a1bf-90034da5fa04\") " 
pod="openshift-infra/auto-csr-approver-29564380-wh9jw" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.496248 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564380-wh9jw" Mar 18 19:40:00 crc kubenswrapper[4830]: I0318 19:40:00.789071 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564380-wh9jw"] Mar 18 19:40:01 crc kubenswrapper[4830]: I0318 19:40:01.234625 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b" Mar 18 19:40:01 crc kubenswrapper[4830]: E0318 19:40:01.235132 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4" Mar 18 19:40:01 crc kubenswrapper[4830]: I0318 19:40:01.560900 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564380-wh9jw" event={"ID":"8d4f49d6-7305-42d0-a1bf-90034da5fa04","Type":"ContainerStarted","Data":"8f637c027eb31dd8ba5475f3aadb98d84cb636d9d40a06731c8a8508818460bc"} Mar 18 19:40:02 crc kubenswrapper[4830]: I0318 19:40:02.572637 4830 generic.go:334] "Generic (PLEG): container finished" podID="8d4f49d6-7305-42d0-a1bf-90034da5fa04" containerID="d11af5dcd6edb2d6a8dcc0260f7c0a2a742ef7fb8eb9dc1b732dc4d295cc4215" exitCode=0 Mar 18 19:40:02 crc kubenswrapper[4830]: I0318 19:40:02.572739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564380-wh9jw" event={"ID":"8d4f49d6-7305-42d0-a1bf-90034da5fa04","Type":"ContainerDied","Data":"d11af5dcd6edb2d6a8dcc0260f7c0a2a742ef7fb8eb9dc1b732dc4d295cc4215"} 
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.297860 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.302312 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.321467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.375124 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.375317 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswf2\" (UniqueName: \"kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.375447 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.476617 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswf2\" (UniqueName: \"kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.476709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.476793 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.477330 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.477744 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.501501 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bswf2\" (UniqueName: \"kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2\") pod \"redhat-marketplace-kb5v7\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") " pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.639958 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:03 crc kubenswrapper[4830]: I0318 19:40:03.957279 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564380-wh9jw"
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.086843 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgw46\" (UniqueName: \"kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46\") pod \"8d4f49d6-7305-42d0-a1bf-90034da5fa04\" (UID: \"8d4f49d6-7305-42d0-a1bf-90034da5fa04\") "
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.093582 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46" (OuterVolumeSpecName: "kube-api-access-xgw46") pod "8d4f49d6-7305-42d0-a1bf-90034da5fa04" (UID: "8d4f49d6-7305-42d0-a1bf-90034da5fa04"). InnerVolumeSpecName "kube-api-access-xgw46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.163430 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:04 crc kubenswrapper[4830]: W0318 19:40:04.170276 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c4a191e_3bb1_41e4_acd2_3eac62f071bc.slice/crio-9db0f6c96dec08194dc9bf058e21be818549b8f41f06121c21c9e01085ae68b5 WatchSource:0}: Error finding container 9db0f6c96dec08194dc9bf058e21be818549b8f41f06121c21c9e01085ae68b5: Status 404 returned error can't find the container with id 9db0f6c96dec08194dc9bf058e21be818549b8f41f06121c21c9e01085ae68b5
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.189495 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgw46\" (UniqueName: \"kubernetes.io/projected/8d4f49d6-7305-42d0-a1bf-90034da5fa04-kube-api-access-xgw46\") on node \"crc\" DevicePath \"\""
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.597051 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564380-wh9jw" event={"ID":"8d4f49d6-7305-42d0-a1bf-90034da5fa04","Type":"ContainerDied","Data":"8f637c027eb31dd8ba5475f3aadb98d84cb636d9d40a06731c8a8508818460bc"}
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.597085 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564380-wh9jw"
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.597094 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f637c027eb31dd8ba5475f3aadb98d84cb636d9d40a06731c8a8508818460bc"
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.600018 4830 generic.go:334] "Generic (PLEG): container finished" podID="7c4a191e-3bb1-41e4-acd2-3eac62f071bc" containerID="b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03" exitCode=0
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.600056 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerDied","Data":"b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03"}
Mar 18 19:40:04 crc kubenswrapper[4830]: I0318 19:40:04.600101 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerStarted","Data":"9db0f6c96dec08194dc9bf058e21be818549b8f41f06121c21c9e01085ae68b5"}
Mar 18 19:40:05 crc kubenswrapper[4830]: I0318 19:40:05.044133 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-xb5r4"]
Mar 18 19:40:05 crc kubenswrapper[4830]: I0318 19:40:05.057433 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-xb5r4"]
Mar 18 19:40:06 crc kubenswrapper[4830]: I0318 19:40:06.255750 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c289a25d-46e5-48a1-8e1a-ec0415ee0bd2" path="/var/lib/kubelet/pods/c289a25d-46e5-48a1-8e1a-ec0415ee0bd2/volumes"
Mar 18 19:40:07 crc kubenswrapper[4830]: I0318 19:40:07.629511 4830 generic.go:334] "Generic (PLEG): container finished" podID="7c4a191e-3bb1-41e4-acd2-3eac62f071bc" containerID="6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715" exitCode=0
Mar 18 19:40:07 crc kubenswrapper[4830]: I0318 19:40:07.629611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerDied","Data":"6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715"}
Mar 18 19:40:08 crc kubenswrapper[4830]: I0318 19:40:08.642104 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerStarted","Data":"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"}
Mar 18 19:40:08 crc kubenswrapper[4830]: I0318 19:40:08.676328 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kb5v7" podStartSLOduration=1.9893579620000001 podStartE2EDuration="5.676308806s" podCreationTimestamp="2026-03-18 19:40:03 +0000 UTC" firstStartedPulling="2026-03-18 19:40:04.60186483 +0000 UTC m=+5839.169495162" lastFinishedPulling="2026-03-18 19:40:08.288815664 +0000 UTC m=+5842.856446006" observedRunningTime="2026-03-18 19:40:08.673518758 +0000 UTC m=+5843.241149130" watchObservedRunningTime="2026-03-18 19:40:08.676308806 +0000 UTC m=+5843.243939158"
Mar 18 19:40:12 crc kubenswrapper[4830]: I0318 19:40:12.235134 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:40:12 crc kubenswrapper[4830]: E0318 19:40:12.235846 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:40:13 crc kubenswrapper[4830]: I0318 19:40:13.640046 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:13 crc kubenswrapper[4830]: I0318 19:40:13.640364 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:13 crc kubenswrapper[4830]: I0318 19:40:13.708725 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:13 crc kubenswrapper[4830]: I0318 19:40:13.792075 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:13 crc kubenswrapper[4830]: I0318 19:40:13.959812 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:15 crc kubenswrapper[4830]: I0318 19:40:15.712880 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kb5v7" podUID="7c4a191e-3bb1-41e4-acd2-3eac62f071bc" containerName="registry-server" containerID="cri-o://1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547" gracePeriod=2
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.154605 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.211767 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities\") pod \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") "
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.211924 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswf2\" (UniqueName: \"kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2\") pod \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") "
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.212272 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content\") pod \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\" (UID: \"7c4a191e-3bb1-41e4-acd2-3eac62f071bc\") "
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.213879 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities" (OuterVolumeSpecName: "utilities") pod "7c4a191e-3bb1-41e4-acd2-3eac62f071bc" (UID: "7c4a191e-3bb1-41e4-acd2-3eac62f071bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.218974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2" (OuterVolumeSpecName: "kube-api-access-bswf2") pod "7c4a191e-3bb1-41e4-acd2-3eac62f071bc" (UID: "7c4a191e-3bb1-41e4-acd2-3eac62f071bc"). InnerVolumeSpecName "kube-api-access-bswf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.269241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c4a191e-3bb1-41e4-acd2-3eac62f071bc" (UID: "7c4a191e-3bb1-41e4-acd2-3eac62f071bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.316118 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.316171 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswf2\" (UniqueName: \"kubernetes.io/projected/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-kube-api-access-bswf2\") on node \"crc\" DevicePath \"\""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.316194 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4a191e-3bb1-41e4-acd2-3eac62f071bc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.727372 4830 generic.go:334] "Generic (PLEG): container finished" podID="7c4a191e-3bb1-41e4-acd2-3eac62f071bc" containerID="1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547" exitCode=0
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.727436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerDied","Data":"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"}
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.727458 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb5v7"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.727490 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb5v7" event={"ID":"7c4a191e-3bb1-41e4-acd2-3eac62f071bc","Type":"ContainerDied","Data":"9db0f6c96dec08194dc9bf058e21be818549b8f41f06121c21c9e01085ae68b5"}
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.727533 4830 scope.go:117] "RemoveContainer" containerID="1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.759027 4830 scope.go:117] "RemoveContainer" containerID="6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.786074 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.794766 4830 scope.go:117] "RemoveContainer" containerID="b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.798744 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb5v7"]
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.844853 4830 scope.go:117] "RemoveContainer" containerID="1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"
Mar 18 19:40:16 crc kubenswrapper[4830]: E0318 19:40:16.846169 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547\": container with ID starting with 1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547 not found: ID does not exist" containerID="1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.846273 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547"} err="failed to get container status \"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547\": rpc error: code = NotFound desc = could not find container \"1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547\": container with ID starting with 1ca0f59ea1180d55e2e8a36a9b513f67e62d2a93e13d6db529408fd0e60aa547 not found: ID does not exist"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.846314 4830 scope.go:117] "RemoveContainer" containerID="6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715"
Mar 18 19:40:16 crc kubenswrapper[4830]: E0318 19:40:16.846991 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715\": container with ID starting with 6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715 not found: ID does not exist" containerID="6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.847052 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715"} err="failed to get container status \"6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715\": rpc error: code = NotFound desc = could not find container \"6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715\": container with ID starting with 6d80ef2611c8cf0584118492ffe96880754a1b7ce76911d659c0b2bfa602a715 not found: ID does not exist"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.847091 4830 scope.go:117] "RemoveContainer" containerID="b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03"
Mar 18 19:40:16 crc kubenswrapper[4830]: E0318 19:40:16.847561 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03\": container with ID starting with b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03 not found: ID does not exist" containerID="b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03"
Mar 18 19:40:16 crc kubenswrapper[4830]: I0318 19:40:16.847612 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03"} err="failed to get container status \"b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03\": rpc error: code = NotFound desc = could not find container \"b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03\": container with ID starting with b612fafa9759d4d1866e480fd2572cfe2c9a77b02191701aaaa477bd6b125d03 not found: ID does not exist"
Mar 18 19:40:18 crc kubenswrapper[4830]: I0318 19:40:18.246134 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4a191e-3bb1-41e4-acd2-3eac62f071bc" path="/var/lib/kubelet/pods/7c4a191e-3bb1-41e4-acd2-3eac62f071bc/volumes"
Mar 18 19:40:21 crc kubenswrapper[4830]: I0318 19:40:21.298539 4830 scope.go:117] "RemoveContainer" containerID="82b0782b4012662d0d892f8c021f7d47d87c798c6505077e79cadb9d21947411"
Mar 18 19:40:21 crc kubenswrapper[4830]: I0318 19:40:21.326688 4830 scope.go:117] "RemoveContainer" containerID="c2957f435f804f1afe69869cfd2f39a75df751ca9c6696ad1f1cbe7baab0573e"
Mar 18 19:40:23 crc kubenswrapper[4830]: I0318 19:40:23.235370 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:40:23 crc kubenswrapper[4830]: E0318 19:40:23.236109 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:40:36 crc kubenswrapper[4830]: I0318 19:40:36.240962 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:40:36 crc kubenswrapper[4830]: E0318 19:40:36.242668 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:40:48 crc kubenswrapper[4830]: I0318 19:40:48.235196 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:40:48 crc kubenswrapper[4830]: E0318 19:40:48.236345 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:41:02 crc kubenswrapper[4830]: I0318 19:41:02.236051 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:41:02 crc kubenswrapper[4830]: E0318 19:41:02.237081 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:41:17 crc kubenswrapper[4830]: I0318 19:41:17.235345 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:41:17 crc kubenswrapper[4830]: E0318 19:41:17.236442 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"
Mar 18 19:41:31 crc kubenswrapper[4830]: I0318 19:41:31.234741 4830 scope.go:117] "RemoveContainer" containerID="12586c31daa01567ccf75e5331eca78a5750508a1fa1f0cccc481425a63bde9b"
Mar 18 19:41:31 crc kubenswrapper[4830]: E0318 19:41:31.235558 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-plzpb_openshift-machine-config-operator(fbe02a32-24dc-4772-8a10-0128d3a304e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-plzpb" podUID="fbe02a32-24dc-4772-8a10-0128d3a304e4"